{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "**Chapter 9 – Up and running with TensorFlow**" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "_This notebook contains all the sample code and solutions to the exercises in chapter 9._\n", "\n", "\n", " \n", "
\n", " Run in Google Colab\n", "
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Warning**: this is the code for the 1st edition of the book. Please visit https://github.com/ageron/handson-ml2 for the 2nd edition code, with up-to-date notebooks using the latest library versions. In particular, the 1st edition is based on TensorFlow 1, while the 2nd edition uses TensorFlow 2, which is much simpler to use." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Setup" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "First, let's make sure this notebook works well in both python 2 and 3, import a few common modules, ensure MatplotLib plots figures inline and prepare a function to save the figures:" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "# To support both python 2 and python 3\n", "from __future__ import division, print_function, unicode_literals\n", "\n", "# Common imports\n", "import numpy as np\n", "import os\n", "\n", "try:\n", " # %tensorflow_version only exists in Colab.\n", " %tensorflow_version 1.x\n", "except Exception:\n", " pass\n", "\n", "# to make this notebook's output stable across runs\n", "def reset_graph(seed=42):\n", " tf.reset_default_graph()\n", " tf.set_random_seed(seed)\n", " np.random.seed(seed)\n", "\n", "# To plot pretty figures\n", "%matplotlib inline\n", "import matplotlib\n", "import matplotlib.pyplot as plt\n", "plt.rcParams['axes.labelsize'] = 14\n", "plt.rcParams['xtick.labelsize'] = 12\n", "plt.rcParams['ytick.labelsize'] = 12\n", "\n", "# Where to save the figures\n", "PROJECT_ROOT_DIR = \".\"\n", "CHAPTER_ID = \"tensorflow\"\n", "IMAGES_PATH = os.path.join(PROJECT_ROOT_DIR, \"images\", CHAPTER_ID)\n", "os.makedirs(IMAGES_PATH, exist_ok=True)\n", "\n", "def save_fig(fig_id, tight_layout=True, fig_extension=\"png\", resolution=300):\n", " path = os.path.join(IMAGES_PATH, fig_id + \".\" + fig_extension)\n", " print(\"Saving figure\", fig_id)\n", " if tight_layout:\n", " plt.tight_layout()\n", " plt.savefig(path, format=fig_extension, dpi=resolution)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Creating and running a graph" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "import tensorflow as tf\n", "\n", "reset_graph()\n", "\n", "x = tf.Variable(3, name=\"x\")\n", "y = tf.Variable(4, name=\"y\")\n", "f = x*x*y + y + 2" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 3, "metadata": {}, "output_type": "execute_result" } ], "source": [ "f" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "42\n" ] } ], "source": [ "sess = tf.Session()\n", "sess.run(x.initializer)\n", "sess.run(y.initializer)\n", "result = sess.run(f)\n", "print(result)" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [], "source": [ "sess.close()" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ "with tf.Session() as sess:\n", " x.initializer.run()\n", " y.initializer.run()\n", " result = f.eval()" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "42" ] }, "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ "result" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [], "source": [ "init = 
tf.global_variables_initializer()\n", "\n", "with tf.Session() as sess:\n", " init.run()\n", " result = f.eval()" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "42" ] }, "execution_count": 9, "metadata": {}, "output_type": "execute_result" } ], "source": [ "result" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [], "source": [ "init = tf.global_variables_initializer()" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "42\n" ] } ], "source": [ "sess = tf.InteractiveSession()\n", "init.run()\n", "result = f.eval()\n", "print(result)" ] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [], "source": [ "sess.close()" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "42" ] }, "execution_count": 13, "metadata": {}, "output_type": "execute_result" } ], "source": [ "result" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Managing graphs" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "True" ] }, "execution_count": 14, "metadata": {}, "output_type": "execute_result" } ], "source": [ "reset_graph()\n", "\n", "x1 = tf.Variable(1)\n", "x1.graph is tf.get_default_graph()" ] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "True" ] }, "execution_count": 15, "metadata": {}, "output_type": "execute_result" } ], "source": [ "graph = tf.Graph()\n", "with graph.as_default():\n", " x2 = tf.Variable(2)\n", "\n", "x2.graph is graph" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "scrolled": true }, "outputs": [ { "data": { "text/plain": [ "False" ] }, "execution_count": 16, "metadata": {}, "output_type": "execute_result" } ], "source": [ "x2.graph is tf.get_default_graph()" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "10\n", "15\n" ] } ], "source": [ "w = tf.constant(3)\n", "x = w + 2\n", "y = x + 5\n", "z = x * 3\n", "\n", "with tf.Session() as sess:\n", " print(y.eval()) # 10\n", " print(z.eval()) # 15" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "10\n", "15\n" ] } ], "source": [ "with tf.Session() as sess:\n", " y_val, z_val = sess.run([y, z])\n", " print(y_val) # 10\n", " print(z_val) # 15" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Linear Regression" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Using the Normal Equation" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "from sklearn.datasets import fetch_california_housing\n", "\n", "reset_graph()\n", "\n", "housing = fetch_california_housing()\n", "m, n = housing.data.shape\n", "housing_data_plus_bias = np.c_[np.ones((m, 1)), housing.data]\n", "\n", "X = tf.constant(housing_data_plus_bias, dtype=tf.float32, name=\"X\")\n", "y = tf.constant(housing.target.reshape(-1, 1), dtype=tf.float32, name=\"y\")\n", "XT = tf.transpose(X)\n", "theta = tf.matmul(tf.matmul(tf.matrix_inverse(tf.matmul(XT, X)), XT), y)\n", "\n", "with tf.Session() as sess:\n", " theta_value = theta.eval()" ] }, { "cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [ { "data": { 
"text/plain": [ "array([[-3.7185181e+01],\n", " [ 4.3633747e-01],\n", " [ 9.3952334e-03],\n", " [-1.0711310e-01],\n", " [ 6.4479220e-01],\n", " [-4.0338000e-06],\n", " [-3.7813708e-03],\n", " [-4.2348403e-01],\n", " [-4.3721911e-01]], dtype=float32)" ] }, "execution_count": 20, "metadata": {}, "output_type": "execute_result" } ], "source": [ "theta_value" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Compare with pure NumPy" ] }, { "cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[[-3.69419202e+01]\n", " [ 4.36693293e-01]\n", " [ 9.43577803e-03]\n", " [-1.07322041e-01]\n", " [ 6.45065694e-01]\n", " [-3.97638942e-06]\n", " [-3.78654265e-03]\n", " [-4.21314378e-01]\n", " [-4.34513755e-01]]\n" ] } ], "source": [ "X = housing_data_plus_bias\n", "y = housing.target.reshape(-1, 1)\n", "theta_numpy = np.linalg.inv(X.T.dot(X)).dot(X.T).dot(y)\n", "\n", "print(theta_numpy)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Compare with Scikit-Learn" ] }, { "cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[[-3.69419202e+01]\n", " [ 4.36693293e-01]\n", " [ 9.43577803e-03]\n", " [-1.07322041e-01]\n", " [ 6.45065694e-01]\n", " [-3.97638942e-06]\n", " [-3.78654265e-03]\n", " [-4.21314378e-01]\n", " [-4.34513755e-01]]\n" ] } ], "source": [ "from sklearn.linear_model import LinearRegression\n", "lin_reg = LinearRegression()\n", "lin_reg.fit(housing.data, housing.target.reshape(-1, 1))\n", "\n", "print(np.r_[lin_reg.intercept_.reshape(-1, 1), lin_reg.coef_.T])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Using Batch Gradient Descent" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Gradient Descent requires scaling the feature vectors first. We could do this using TF, but let's just use Scikit-Learn for now." ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [], "source": [ "from sklearn.preprocessing import StandardScaler\n", "scaler = StandardScaler()\n", "scaled_housing_data = scaler.fit_transform(housing.data)\n", "scaled_housing_data_plus_bias = np.c_[np.ones((m, 1)), scaled_housing_data]" ] }, { "cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[ 1.00000000e+00 6.60969987e-17 5.50808322e-18 6.60969987e-17\n", " -1.06030602e-16 -1.10161664e-17 3.44255201e-18 -1.07958431e-15\n", " -8.52651283e-15]\n", "[ 0.38915536 0.36424355 0.5116157 ... 
-0.06612179 -0.06360587\n", " 0.01359031]\n", "0.11111111111111005\n", "(20640, 9)\n" ] } ], "source": [ "print(scaled_housing_data_plus_bias.mean(axis=0))\n", "print(scaled_housing_data_plus_bias.mean(axis=1))\n", "print(scaled_housing_data_plus_bias.mean())\n", "print(scaled_housing_data_plus_bias.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Manually computing the gradients" ] }, { "cell_type": "code", "execution_count": 25, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 0 MSE = 9.161542\n", "Epoch 100 MSE = 0.7145004\n", "Epoch 200 MSE = 0.56670487\n", "Epoch 300 MSE = 0.5555718\n", "Epoch 400 MSE = 0.5488112\n", "Epoch 500 MSE = 0.5436363\n", "Epoch 600 MSE = 0.5396291\n", "Epoch 700 MSE = 0.5365092\n", "Epoch 800 MSE = 0.53406775\n", "Epoch 900 MSE = 0.5321473\n" ] } ], "source": [ "reset_graph()\n", "\n", "n_epochs = 1000\n", "learning_rate = 0.01\n", "\n", "X = tf.constant(scaled_housing_data_plus_bias, dtype=tf.float32, name=\"X\")\n", "y = tf.constant(housing.target.reshape(-1, 1), dtype=tf.float32, name=\"y\")\n", "theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name=\"theta\")\n", "y_pred = tf.matmul(X, theta, name=\"predictions\")\n", "error = y_pred - y\n", "mse = tf.reduce_mean(tf.square(error), name=\"mse\")\n", "gradients = 2/m * tf.matmul(tf.transpose(X), error)\n", "training_op = tf.assign(theta, theta - learning_rate * gradients)\n", "\n", "init = tf.global_variables_initializer()\n", "\n", "with tf.Session() as sess:\n", " sess.run(init)\n", "\n", " for epoch in range(n_epochs):\n", " if epoch % 100 == 0:\n", " print(\"Epoch\", epoch, \"MSE =\", mse.eval())\n", " sess.run(training_op)\n", " \n", " best_theta = theta.eval()" ] }, { "cell_type": "code", "execution_count": 26, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[ 2.0685523 ],\n", " [ 0.8874027 ],\n", " [ 0.14401656],\n", " [-0.34770882],\n", " [ 0.36178368],\n", " [ 0.00393811],\n", " [-0.04269556],\n", " [-0.6614529 ],\n", " [-0.6375279 ]], dtype=float32)" ] }, "execution_count": 26, "metadata": {}, "output_type": "execute_result" } ], "source": [ "best_theta" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Using autodiff" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Same as above except for the `gradients = ...` line:" ] }, { "cell_type": "code", "execution_count": 27, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "n_epochs = 1000\n", "learning_rate = 0.01\n", "\n", "X = tf.constant(scaled_housing_data_plus_bias, dtype=tf.float32, name=\"X\")\n", "y = tf.constant(housing.target.reshape(-1, 1), dtype=tf.float32, name=\"y\")\n", "theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name=\"theta\")\n", "y_pred = tf.matmul(X, theta, name=\"predictions\")\n", "error = y_pred - y\n", "mse = tf.reduce_mean(tf.square(error), name=\"mse\")" ] }, { "cell_type": "code", "execution_count": 28, "metadata": {}, "outputs": [], "source": [ "gradients = tf.gradients(mse, [theta])[0]" ] }, { "cell_type": "code", "execution_count": 29, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 0 MSE = 9.161542\n", "Epoch 100 MSE = 0.71450037\n", "Epoch 200 MSE = 0.56670487\n", "Epoch 300 MSE = 0.5555718\n", "Epoch 400 MSE = 0.54881126\n", "Epoch 500 MSE = 0.5436363\n", "Epoch 600 MSE = 0.53962916\n", "Epoch 700 MSE = 0.5365092\n", "Epoch 800 MSE = 0.53406775\n", "Epoch 900 MSE = 0.5321473\n", "Best theta:\n", "[[ 
2.0685523 ]\n", " [ 0.8874027 ]\n", " [ 0.14401656]\n", " [-0.3477088 ]\n", " [ 0.36178365]\n", " [ 0.00393811]\n", " [-0.04269556]\n", " [-0.66145283]\n", " [-0.6375278 ]]\n" ] } ], "source": [ "training_op = tf.assign(theta, theta - learning_rate * gradients)\n", "\n", "init = tf.global_variables_initializer()\n", "\n", "with tf.Session() as sess:\n", " sess.run(init)\n", "\n", " for epoch in range(n_epochs):\n", " if epoch % 100 == 0:\n", " print(\"Epoch\", epoch, \"MSE =\", mse.eval())\n", " sess.run(training_op)\n", " \n", " best_theta = theta.eval()\n", "\n", "print(\"Best theta:\")\n", "print(best_theta)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "How could you find the partial derivatives of the following function with regards to `a` and `b`?" ] }, { "cell_type": "code", "execution_count": 30, "metadata": {}, "outputs": [], "source": [ "def my_func(a, b):\n", " z = 0\n", " for i in range(100):\n", " z = a * np.cos(z + i) + z * np.sin(b - i)\n", " return z" ] }, { "cell_type": "code", "execution_count": 31, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "-0.21253923284754914" ] }, "execution_count": 31, "metadata": {}, "output_type": "execute_result" } ], "source": [ "my_func(0.2, 0.3)" ] }, { "cell_type": "code", "execution_count": 32, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "a = tf.Variable(0.2, name=\"a\")\n", "b = tf.Variable(0.3, name=\"b\")\n", "z = tf.constant(0.0, name=\"z0\")\n", "for i in range(100):\n", " z = a * tf.cos(z + i) + z * tf.sin(b - i)\n", "\n", "grads = tf.gradients(z, [a, b])\n", "init = tf.global_variables_initializer()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's compute the function at $a=0.2$ and $b=0.3$, and the partial derivatives at that point with regards to $a$ and with regards to $b$:" ] }, { "cell_type": "code", "execution_count": 33, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "-0.21253741\n", "[-1.1388495, 0.19671397]\n" ] } ], "source": [ "with tf.Session() as sess:\n", " init.run()\n", " print(z.eval())\n", " print(sess.run(grads))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Using a `GradientDescentOptimizer`" ] }, { "cell_type": "code", "execution_count": 34, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "n_epochs = 1000\n", "learning_rate = 0.01\n", "\n", "X = tf.constant(scaled_housing_data_plus_bias, dtype=tf.float32, name=\"X\")\n", "y = tf.constant(housing.target.reshape(-1, 1), dtype=tf.float32, name=\"y\")\n", "theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name=\"theta\")\n", "y_pred = tf.matmul(X, theta, name=\"predictions\")\n", "error = y_pred - y\n", "mse = tf.reduce_mean(tf.square(error), name=\"mse\")" ] }, { "cell_type": "code", "execution_count": 35, "metadata": {}, "outputs": [], "source": [ "optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)\n", "training_op = optimizer.minimize(mse)" ] }, { "cell_type": "code", "execution_count": 36, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 0 MSE = 9.161542\n", "Epoch 100 MSE = 0.7145004\n", "Epoch 200 MSE = 0.56670487\n", "Epoch 300 MSE = 0.5555718\n", "Epoch 400 MSE = 0.54881126\n", "Epoch 500 MSE = 0.5436363\n", "Epoch 600 MSE = 0.53962916\n", "Epoch 700 MSE = 0.5365092\n", "Epoch 800 MSE = 0.53406775\n", "Epoch 900 MSE = 0.5321473\n", "Best theta:\n", "[[ 2.0685523 ]\n", " [ 0.8874027 ]\n", " [ 0.14401656]\n", " [-0.3477088 ]\n", " 
[ 0.36178365]\n", " [ 0.00393811]\n", " [-0.04269556]\n", " [-0.66145283]\n", " [-0.6375278 ]]\n" ] } ], "source": [ "init = tf.global_variables_initializer()\n", "\n", "with tf.Session() as sess:\n", " sess.run(init)\n", "\n", " for epoch in range(n_epochs):\n", " if epoch % 100 == 0:\n", " print(\"Epoch\", epoch, \"MSE =\", mse.eval())\n", " sess.run(training_op)\n", " \n", " best_theta = theta.eval()\n", "\n", "print(\"Best theta:\")\n", "print(best_theta)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Using a momentum optimizer" ] }, { "cell_type": "code", "execution_count": 37, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "n_epochs = 1000\n", "learning_rate = 0.01\n", "\n", "X = tf.constant(scaled_housing_data_plus_bias, dtype=tf.float32, name=\"X\")\n", "y = tf.constant(housing.target.reshape(-1, 1), dtype=tf.float32, name=\"y\")\n", "theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name=\"theta\")\n", "y_pred = tf.matmul(X, theta, name=\"predictions\")\n", "error = y_pred - y\n", "mse = tf.reduce_mean(tf.square(error), name=\"mse\")" ] }, { "cell_type": "code", "execution_count": 38, "metadata": {}, "outputs": [], "source": [ "optimizer = tf.train.MomentumOptimizer(learning_rate=learning_rate,\n", " momentum=0.9)" ] }, { "cell_type": "code", "execution_count": 39, "metadata": {}, "outputs": [], "source": [ "training_op = optimizer.minimize(mse)\n", "\n", "init = tf.global_variables_initializer()" ] }, { "cell_type": "code", "execution_count": 40, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Best theta:\n", "[[ 2.068558 ]\n", " [ 0.82962847]\n", " [ 0.11875335]\n", " [-0.26554456]\n", " [ 0.3057109 ]\n", " [-0.00450249]\n", " [-0.03932662]\n", " [-0.8998645 ]\n", " [-0.8705207 ]]\n" ] } ], "source": [ "with tf.Session() as sess:\n", " sess.run(init)\n", "\n", " for epoch in range(n_epochs):\n", " sess.run(training_op)\n", " \n", " best_theta = theta.eval()\n", "\n", "print(\"Best theta:\")\n", "print(best_theta)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Feeding data to the training algorithm" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Placeholder nodes" ] }, { "cell_type": "code", "execution_count": 41, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[[6. 7. 8.]]\n" ] } ], "source": [ "reset_graph()\n", "\n", "A = tf.placeholder(tf.float32, shape=(None, 3))\n", "B = A + 5\n", "with tf.Session() as sess:\n", " B_val_1 = B.eval(feed_dict={A: [[1, 2, 3]]})\n", " B_val_2 = B.eval(feed_dict={A: [[4, 5, 6], [7, 8, 9]]})\n", "\n", "print(B_val_1)" ] }, { "cell_type": "code", "execution_count": 42, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[[ 9. 10. 11.]\n", " [12. 13. 
14.]]\n" ] } ], "source": [ "print(B_val_2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Mini-batch Gradient Descent" ] }, { "cell_type": "code", "execution_count": 43, "metadata": {}, "outputs": [], "source": [ "n_epochs = 1000\n", "learning_rate = 0.01" ] }, { "cell_type": "code", "execution_count": 44, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "X = tf.placeholder(tf.float32, shape=(None, n + 1), name=\"X\")\n", "y = tf.placeholder(tf.float32, shape=(None, 1), name=\"y\")" ] }, { "cell_type": "code", "execution_count": 45, "metadata": {}, "outputs": [], "source": [ "theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name=\"theta\")\n", "y_pred = tf.matmul(X, theta, name=\"predictions\")\n", "error = y_pred - y\n", "mse = tf.reduce_mean(tf.square(error), name=\"mse\")\n", "optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)\n", "training_op = optimizer.minimize(mse)\n", "\n", "init = tf.global_variables_initializer()" ] }, { "cell_type": "code", "execution_count": 46, "metadata": {}, "outputs": [], "source": [ "n_epochs = 10" ] }, { "cell_type": "code", "execution_count": 47, "metadata": {}, "outputs": [], "source": [ "batch_size = 100\n", "n_batches = int(np.ceil(m / batch_size))" ] }, { "cell_type": "code", "execution_count": 48, "metadata": {}, "outputs": [], "source": [ "def fetch_batch(epoch, batch_index, batch_size):\n", " np.random.seed(epoch * n_batches + batch_index) # not shown in the book\n", " indices = np.random.randint(m, size=batch_size) # not shown\n", " X_batch = scaled_housing_data_plus_bias[indices] # not shown\n", " y_batch = housing.target.reshape(-1, 1)[indices] # not shown\n", " return X_batch, y_batch\n", "\n", "with tf.Session() as sess:\n", " sess.run(init)\n", "\n", " for epoch in range(n_epochs):\n", " for batch_index in range(n_batches):\n", " X_batch, y_batch = fetch_batch(epoch, batch_index, batch_size)\n", " sess.run(training_op, feed_dict={X: X_batch, y: y_batch})\n", "\n", " best_theta = theta.eval()" ] }, { "cell_type": "code", "execution_count": 49, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[ 2.0703337 ],\n", " [ 0.8637145 ],\n", " [ 0.12255152],\n", " [-0.31211877],\n", " [ 0.38510376],\n", " [ 0.00434168],\n", " [-0.0123295 ],\n", " [-0.83376896],\n", " [-0.8030471 ]], dtype=float32)" ] }, "execution_count": 49, "metadata": {}, "output_type": "execute_result" } ], "source": [ "best_theta" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Saving and restoring a model" ] }, { "cell_type": "code", "execution_count": 50, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 0 MSE = 9.161542\n", "Epoch 100 MSE = 0.7145004\n", "Epoch 200 MSE = 0.56670487\n", "Epoch 300 MSE = 0.5555718\n", "Epoch 400 MSE = 0.54881126\n", "Epoch 500 MSE = 0.5436363\n", "Epoch 600 MSE = 0.53962916\n", "Epoch 700 MSE = 0.5365092\n", "Epoch 800 MSE = 0.53406775\n", "Epoch 900 MSE = 0.5321473\n" ] } ], "source": [ "reset_graph()\n", "\n", "n_epochs = 1000 # not shown in the book\n", "learning_rate = 0.01 # not shown\n", "\n", "X = tf.constant(scaled_housing_data_plus_bias, dtype=tf.float32, name=\"X\") # not shown\n", "y = tf.constant(housing.target.reshape(-1, 1), dtype=tf.float32, name=\"y\") # not shown\n", "theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name=\"theta\")\n", "y_pred = tf.matmul(X, theta, name=\"predictions\") # not shown\n", "error = y_pred - y # not shown\n", "mse = 
tf.reduce_mean(tf.square(error), name=\"mse\") # not shown\n", "optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate) # not shown\n", "training_op = optimizer.minimize(mse) # not shown\n", "\n", "init = tf.global_variables_initializer()\n", "saver = tf.train.Saver()\n", "\n", "with tf.Session() as sess:\n", " sess.run(init)\n", "\n", " for epoch in range(n_epochs):\n", " if epoch % 100 == 0:\n", " print(\"Epoch\", epoch, \"MSE =\", mse.eval()) # not shown\n", " save_path = saver.save(sess, \"/tmp/my_model.ckpt\")\n", " sess.run(training_op)\n", " \n", " best_theta = theta.eval()\n", " save_path = saver.save(sess, \"/tmp/my_model_final.ckpt\")" ] }, { "cell_type": "code", "execution_count": 51, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[ 2.0685523 ],\n", " [ 0.8874027 ],\n", " [ 0.14401656],\n", " [-0.3477088 ],\n", " [ 0.36178365],\n", " [ 0.00393811],\n", " [-0.04269556],\n", " [-0.66145283],\n", " [-0.6375278 ]], dtype=float32)" ] }, "execution_count": 51, "metadata": {}, "output_type": "execute_result" } ], "source": [ "best_theta" ] }, { "cell_type": "code", "execution_count": 52, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "INFO:tensorflow:Restoring parameters from /tmp/my_model_final.ckpt\n" ] } ], "source": [ "with tf.Session() as sess:\n", " saver.restore(sess, \"/tmp/my_model_final.ckpt\")\n", " best_theta_restored = theta.eval() # not shown in the book" ] }, { "cell_type": "code", "execution_count": 53, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "True" ] }, "execution_count": 53, "metadata": {}, "output_type": "execute_result" } ], "source": [ "np.allclose(best_theta, best_theta_restored)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you want to have a saver that loads and restores `theta` with a different name, such as `\"weights\"`:" ] }, { "cell_type": "code", "execution_count": 54, "metadata": {}, "outputs": [], "source": [ "saver = tf.train.Saver({\"weights\": theta})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By default the saver also saves the graph structure itself in a second file with the extension `.meta`. You can use the function `tf.train.import_meta_graph()` to restore the graph structure. 
This function loads the graph into the default graph and returns a `Saver` that can then be used to restore the graph state (i.e., the variable values):" ] }, { "cell_type": "code", "execution_count": 55, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "INFO:tensorflow:Restoring parameters from /tmp/my_model_final.ckpt\n" ] } ], "source": [ "reset_graph()\n", "# notice that we start with an empty graph.\n", "\n", "saver = tf.train.import_meta_graph(\"/tmp/my_model_final.ckpt.meta\") # this loads the graph structure\n", "theta = tf.get_default_graph().get_tensor_by_name(\"theta:0\") # not shown in the book\n", "\n", "with tf.Session() as sess:\n", " saver.restore(sess, \"/tmp/my_model_final.ckpt\") # this restores the graph's state\n", " best_theta_restored = theta.eval() # not shown in the book" ] }, { "cell_type": "code", "execution_count": 56, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "True" ] }, "execution_count": 56, "metadata": {}, "output_type": "execute_result" } ], "source": [ "np.allclose(best_theta, best_theta_restored)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This means that you can import a pretrained model without having to have the corresponding Python code to build the graph. This is very handy when you keep tweaking and saving your model: you can load a previously saved model without having to search for the version of the code that built it." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Visualizing the graph" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "TensorBoard is a great tool to visualize TensorFlow graphs, training curves, and much more. Our TensorFlow code will write various files in a log directory, and the TensorBoard server will regularly read these files and produce nice interactive visualizations. It can plot graphs, learning curves (i.e., how the loss evaluated on the training set or test set evolves as a function of the epoch number), profiling data to identify performance bottlenecks, and more. In short, it helps keep track of everything. Here's the overall picture:\n", "\n", "`TensorFlow writes logs to ===> log directory ===> TensorBoard reads data and displays visualizations`" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If we want to visualize different graphs, or learning curves for different training runs, we don't want the log files to get all mixed up. So we will need one log subdirectory per graph, or per run. 
Let's use a root log directory that we will call `tf_logs`, and a sub-directory that we will call `run-` followed by the current timestamp (you can use any other name you prefer in your own code):" ] }, { "cell_type": "code", "execution_count": 57, "metadata": {}, "outputs": [], "source": [ "from datetime import datetime\n", "\n", "now = datetime.utcnow().strftime(\"%Y%m%d%H%M%S\")\n", "root_logdir = \"tf_logs\"\n", "logdir = \"{}/run-{}/\".format(root_logdir, now)" ] }, { "cell_type": "code", "execution_count": 58, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "'tf_logs/run-20210325095200/'" ] }, "execution_count": 58, "metadata": {}, "output_type": "execute_result" } ], "source": [ "logdir" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In fact, let's create a function that will generate such a subdirectory path every time we need one:" ] }, { "cell_type": "code", "execution_count": 59, "metadata": {}, "outputs": [], "source": [ "def make_log_subdir(run_id=None):\n", " if run_id is None:\n", " run_id = datetime.utcnow().strftime(\"%Y%m%d%H%M%S\")\n", " return \"{}/run-{}/\".format(root_logdir, run_id)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's save the default graph to our log subdirectory using `tf.summary.FileWriter()`:" ] }, { "cell_type": "code", "execution_count": 60, "metadata": {}, "outputs": [], "source": [ "file_writer = tf.summary.FileWriter(logdir, graph=tf.get_default_graph())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now the root log directory contains one subdirectory:" ] }, { "cell_type": "code", "execution_count": 61, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['run-20210325095200']" ] }, "execution_count": 61, "metadata": {}, "output_type": "execute_result" } ], "source": [ "os.listdir(root_logdir)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "And this subdirectory contains one log file (called a \"TF events\" file) for the graph:" ] }, { "cell_type": "code", "execution_count": 62, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['events.out.tfevents.1616665937.kiwimac']" ] }, "execution_count": 62, "metadata": {}, "output_type": "execute_result" } ], "source": [ "os.listdir(logdir)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "However, the actual graph data may still be in the OS's file cache, so we need to `flush()` or `close()` the `FileWriter` to be sure that it's well written to disk:" ] }, { "cell_type": "code", "execution_count": 63, "metadata": {}, "outputs": [], "source": [ "file_writer.close()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Okay, now let's start TensorBoard! It runs as a web server in a separate process, so we first need to start it. One way to do that is to run the `tensorboard` command in a terminal window. Another is to use the `%tensorboard` Jupyter extension, which takes care of starting TensorBoard, and it allows us to view TensorBoard's user interface directly within Jupyter. Let's load this extension now:" ] }, { "cell_type": "code", "execution_count": 64, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "The tensorboard extension is already loaded. To reload it, use:\n", " %reload_ext tensorboard\n" ] } ], "source": [ "%load_ext tensorboard" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Next, let's use the `%tensorboard` extension to start the TensorBoard server. 
We need to point it to the root log directory:" ] }, { "cell_type": "code", "execution_count": 65, "metadata": { "scrolled": true }, "outputs": [ { "data": { "text/html": [ "\n", " \n", " \n", " " ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "%tensorboard --logdir {root_logdir}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Great! We can now visualize graphs. :)\n", "\n", "In fact, let's make this easy by creating a `save_graph()` function that will automatically create a new log subdir and save the given graph (by default `tf.get_default_graph()`) to this directory:" ] }, { "cell_type": "code", "execution_count": 66, "metadata": {}, "outputs": [], "source": [ "def save_graph(graph=None, run_id=None):\n", " if graph is None:\n", " graph = tf.get_default_graph()\n", " logdir = make_log_subdir(run_id)\n", " file_writer = tf.summary.FileWriter(logdir, graph=graph)\n", " file_writer.close()\n", " return logdir" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's see if it works:" ] }, { "cell_type": "code", "execution_count": 67, "metadata": { "scrolled": true }, "outputs": [ { "data": { "text/plain": [ "'tf_logs/run-20210325095244/'" ] }, "execution_count": 67, "metadata": {}, "output_type": "execute_result" } ], "source": [ "save_graph()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's look at TensorBoard again. Note that this will reuse the existing TensorBoard server since we're reusing the same root log directory:" ] }, { "cell_type": "code", "execution_count": 68, "metadata": { "scrolled": false }, "outputs": [ { "data": { "text/plain": [ "Reusing TensorBoard on port 6006 (pid 43590), started 0:00:45 ago. (Use '!kill 43590' to kill it.)" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "\n", " \n", " \n", " " ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "%tensorboard --logdir {root_logdir}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Notice that you can switch between runs by picking the log subdirectory you want from the \"Run\" dropdown list (at the top left)." 
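, "\n", "\n", "If you prefer to check which runs exist directly from Python rather than through the TensorBoard UI, here is a minimal sketch using the `root_logdir` variable and the `os` module already defined above:\n", "\n", "```python\n", "import os\n", "# quick check: each entry corresponds to one run subdirectory shown in the Run dropdown\n", "sorted(os.listdir(root_logdir))\n", "```"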
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Visualizing Learning Curves\n", "\n", "Now let's see how to visualize learning curves:" ] }, { "cell_type": "code", "execution_count": 69, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "n_epochs = 1000\n", "learning_rate = 0.01\n", "\n", "X = tf.placeholder(tf.float32, shape=(None, n + 1), name=\"X\")\n", "y = tf.placeholder(tf.float32, shape=(None, 1), name=\"y\")\n", "theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name=\"theta\")\n", "y_pred = tf.matmul(X, theta, name=\"predictions\")\n", "error = y_pred - y\n", "mse = tf.reduce_mean(tf.square(error), name=\"mse\")\n", "optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)\n", "training_op = optimizer.minimize(mse)\n", "\n", "init = tf.global_variables_initializer()" ] }, { "cell_type": "code", "execution_count": 70, "metadata": {}, "outputs": [], "source": [ "logdir = make_log_subdir()" ] }, { "cell_type": "code", "execution_count": 71, "metadata": {}, "outputs": [], "source": [ "mse_summary = tf.summary.scalar('MSE', mse)\n", "file_writer = tf.summary.FileWriter(logdir, tf.get_default_graph())" ] }, { "cell_type": "code", "execution_count": 72, "metadata": {}, "outputs": [], "source": [ "n_epochs = 10\n", "batch_size = 100\n", "n_batches = int(np.ceil(m / batch_size))" ] }, { "cell_type": "code", "execution_count": 73, "metadata": {}, "outputs": [], "source": [ "with tf.Session() as sess: # not shown in the book\n", " sess.run(init) # not shown\n", "\n", " for epoch in range(n_epochs): # not shown\n", " for batch_index in range(n_batches):\n", " X_batch, y_batch = fetch_batch(epoch, batch_index, batch_size)\n", " if batch_index % 10 == 0:\n", " summary_str = mse_summary.eval(feed_dict={X: X_batch, y: y_batch})\n", " step = epoch * n_batches + batch_index\n", " file_writer.add_summary(summary_str, step)\n", " sess.run(training_op, feed_dict={X: X_batch, y: y_batch})\n", "\n", " best_theta = theta.eval() # not shown" ] }, { "cell_type": "code", "execution_count": 74, "metadata": {}, "outputs": [], "source": [ "file_writer.close()" ] }, { "cell_type": "code", "execution_count": 75, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[ 2.07033372],\n", " [ 0.86371452],\n", " [ 0.12255151],\n", " [-0.31211874],\n", " [ 0.38510373],\n", " [ 0.00434168],\n", " [-0.01232954],\n", " [-0.83376896],\n", " [-0.80304712]], dtype=float32)" ] }, "execution_count": 75, "metadata": {}, "output_type": "execute_result" } ], "source": [ "best_theta" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's look at TensorBoard. Try going to the SCALARS tab:" ] }, { "cell_type": "code", "execution_count": 76, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Reusing TensorBoard on port 6006 (pid 43590), started 0:02:08 ago. 
(Use '!kill 43590' to kill it.)" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "\n", " \n", " \n", " " ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "%tensorboard --logdir {root_logdir}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Name scopes" ] }, { "cell_type": "code", "execution_count": 77, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "n_epochs = 1000\n", "learning_rate = 0.01\n", "\n", "X = tf.placeholder(tf.float32, shape=(None, n + 1), name=\"X\")\n", "y = tf.placeholder(tf.float32, shape=(None, 1), name=\"y\")\n", "theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name=\"theta\")\n", "y_pred = tf.matmul(X, theta, name=\"predictions\")" ] }, { "cell_type": "code", "execution_count": 78, "metadata": {}, "outputs": [], "source": [ "with tf.name_scope(\"loss\") as scope:\n", " error = y_pred - y\n", " mse = tf.reduce_mean(tf.square(error), name=\"mse\")" ] }, { "cell_type": "code", "execution_count": 79, "metadata": {}, "outputs": [], "source": [ "optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)\n", "training_op = optimizer.minimize(mse)\n", "\n", "init = tf.global_variables_initializer()\n", "\n", "mse_summary = tf.summary.scalar('MSE', mse)\n", "\n", "logdir = make_log_subdir()\n", "file_writer = tf.summary.FileWriter(logdir, tf.get_default_graph())" ] }, { "cell_type": "code", "execution_count": 80, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Best theta:\n", "[[ 2.07033372]\n", " [ 0.86371452]\n", " [ 0.12255151]\n", " [-0.31211874]\n", " [ 0.38510373]\n", " [ 0.00434168]\n", " [-0.01232954]\n", " [-0.83376896]\n", " [-0.80304712]]\n" ] } ], "source": [ "n_epochs = 10\n", "batch_size = 100\n", "n_batches = int(np.ceil(m / batch_size))\n", "\n", "with tf.Session() as sess:\n", " sess.run(init)\n", "\n", " for epoch in range(n_epochs):\n", " for batch_index in range(n_batches):\n", " X_batch, y_batch = fetch_batch(epoch, batch_index, batch_size)\n", " if batch_index % 10 == 0:\n", " summary_str = mse_summary.eval(feed_dict={X: X_batch, y: y_batch})\n", " step = epoch * n_batches + batch_index\n", " file_writer.add_summary(summary_str, step)\n", " sess.run(training_op, feed_dict={X: X_batch, y: y_batch})\n", "\n", " best_theta = theta.eval()\n", "\n", "file_writer.flush()\n", "file_writer.close()\n", "print(\"Best theta:\")\n", "print(best_theta)" ] }, { "cell_type": "code", "execution_count": 81, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "loss/sub\n" ] } ], "source": [ "print(error.op.name)" ] }, { "cell_type": "code", "execution_count": 82, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "loss/mse\n" ] } ], "source": [ "print(mse.op.name)" ] }, { "cell_type": "code", "execution_count": 83, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "a\n", "a_1\n", "param/a\n", "param_1/a\n" ] } ], "source": [ "reset_graph()\n", "\n", "a1 = tf.Variable(0, name=\"a\") # name == \"a\"\n", "a2 = tf.Variable(0, name=\"a\") # name == \"a_1\"\n", "\n", "with tf.name_scope(\"param\"): # name == \"param\"\n", " a3 = tf.Variable(0, name=\"a\") # name == \"param/a\"\n", "\n", "with tf.name_scope(\"param\"): # name == \"param_1\"\n", " a4 = tf.Variable(0, name=\"a\") # name == \"param_1/a\"\n", "\n", "for node in (a1, a2, a3, a4):\n", " print(node.op.name)" ] }, { "cell_type": "markdown", 
"metadata": {}, "source": [ "# Modularity" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "An ugly flat code:" ] }, { "cell_type": "code", "execution_count": 84, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "n_features = 3\n", "X = tf.placeholder(tf.float32, shape=(None, n_features), name=\"X\")\n", "\n", "w1 = tf.Variable(tf.random_normal((n_features, 1)), name=\"weights1\")\n", "w2 = tf.Variable(tf.random_normal((n_features, 1)), name=\"weights2\")\n", "b1 = tf.Variable(0.0, name=\"bias1\")\n", "b2 = tf.Variable(0.0, name=\"bias2\")\n", "\n", "z1 = tf.add(tf.matmul(X, w1), b1, name=\"z1\")\n", "z2 = tf.add(tf.matmul(X, w2), b2, name=\"z2\")\n", "\n", "relu1 = tf.maximum(z1, 0., name=\"relu1\")\n", "relu2 = tf.maximum(z1, 0., name=\"relu2\") # Oops, cut&paste error! Did you spot it?\n", "\n", "output = tf.add(relu1, relu2, name=\"output\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Much better, using a function to build the ReLUs:" ] }, { "cell_type": "code", "execution_count": 85, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "def relu(X):\n", " w_shape = (int(X.get_shape()[1]), 1)\n", " w = tf.Variable(tf.random_normal(w_shape), name=\"weights\")\n", " b = tf.Variable(0.0, name=\"bias\")\n", " z = tf.add(tf.matmul(X, w), b, name=\"z\")\n", " return tf.maximum(z, 0., name=\"relu\")\n", "\n", "n_features = 3\n", "X = tf.placeholder(tf.float32, shape=(None, n_features), name=\"X\")\n", "relus = [relu(X) for i in range(5)]\n", "output = tf.add_n(relus, name=\"output\")" ] }, { "cell_type": "code", "execution_count": 86, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "'tf_logs/run-relu1/'" ] }, "execution_count": 86, "metadata": {}, "output_type": "execute_result" } ], "source": [ "save_graph(run_id=\"relu1\")" ] }, { "cell_type": "code", "execution_count": 87, "metadata": { "scrolled": false }, "outputs": [ { "data": { "text/plain": [ "Reusing TensorBoard on port 6006 (pid 43590), started 0:04:52 ago. 
(Use '!kill 43590' to kill it.)" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "\n", " \n", " \n", " " ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "%tensorboard --logdir {root_logdir}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Even better using name scopes:" ] }, { "cell_type": "code", "execution_count": 88, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "def relu(X):\n", " with tf.name_scope(\"relu\"):\n", " w_shape = (int(X.get_shape()[1]), 1) # not shown in the book\n", " w = tf.Variable(tf.random_normal(w_shape), name=\"weights\") # not shown\n", " b = tf.Variable(0.0, name=\"bias\") # not shown\n", " z = tf.add(tf.matmul(X, w), b, name=\"z\") # not shown\n", " return tf.maximum(z, 0., name=\"max\") # not shown" ] }, { "cell_type": "code", "execution_count": 89, "metadata": {}, "outputs": [], "source": [ "n_features = 3\n", "X = tf.placeholder(tf.float32, shape=(None, n_features), name=\"X\")\n", "relus = [relu(X) for i in range(5)]\n", "output = tf.add_n(relus, name=\"output\")" ] }, { "cell_type": "code", "execution_count": 90, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "'tf_logs/run-relu2/'" ] }, "execution_count": 90, "metadata": {}, "output_type": "execute_result" } ], "source": [ "save_graph(run_id=\"relu2\")" ] }, { "cell_type": "code", "execution_count": 91, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Reusing TensorBoard on port 6006 (pid 43590), started 0:05:43 ago. (Use '!kill 43590' to kill it.)" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "\n", " \n", " \n", " " ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "%tensorboard --logdir {root_logdir}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Sharing Variables" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Sharing a `threshold` variable the classic way, by defining it outside of the `relu()` function then passing it as a parameter:" ] }, { "cell_type": "code", "execution_count": 92, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "def relu(X, threshold):\n", " with tf.name_scope(\"relu\"):\n", " w_shape = (int(X.get_shape()[1]), 1) # not shown in the book\n", " w = tf.Variable(tf.random_normal(w_shape), name=\"weights\") # not shown\n", " b = tf.Variable(0.0, name=\"bias\") # not shown\n", " z = tf.add(tf.matmul(X, w), b, name=\"z\") # not shown\n", " return tf.maximum(z, threshold, name=\"max\")\n", "\n", "threshold = tf.Variable(0.0, name=\"threshold\")\n", "X = tf.placeholder(tf.float32, shape=(None, n_features), name=\"X\")\n", "relus = [relu(X, threshold) for i in range(5)]\n", "output = tf.add_n(relus, name=\"output\")" ] }, { "cell_type": "code", "execution_count": 93, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "def relu(X):\n", " with tf.name_scope(\"relu\"):\n", " if not hasattr(relu, \"threshold\"):\n", " relu.threshold = tf.Variable(0.0, name=\"threshold\")\n", " w_shape = int(X.get_shape()[1]), 1 # not shown in the book\n", " w = tf.Variable(tf.random_normal(w_shape), name=\"weights\") # not shown\n", " b = tf.Variable(0.0, name=\"bias\") # not shown\n", " z = tf.add(tf.matmul(X, w), b, name=\"z\") # not shown\n", " return tf.maximum(z, relu.threshold, name=\"max\")" ] }, { "cell_type": "code", "execution_count": 94, "metadata": {}, "outputs": [], "source": [ "X = tf.placeholder(tf.float32, shape=(None, 
n_features), name=\"X\")\n", "relus = [relu(X) for i in range(5)]\n", "output = tf.add_n(relus, name=\"output\")" ] }, { "cell_type": "code", "execution_count": 95, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "with tf.variable_scope(\"relu\"):\n", " threshold = tf.get_variable(\"threshold\", shape=(),\n", " initializer=tf.constant_initializer(0.0))" ] }, { "cell_type": "code", "execution_count": 96, "metadata": {}, "outputs": [], "source": [ "with tf.variable_scope(\"relu\", reuse=True):\n", " threshold = tf.get_variable(\"threshold\")" ] }, { "cell_type": "code", "execution_count": 97, "metadata": {}, "outputs": [], "source": [ "with tf.variable_scope(\"relu\") as scope:\n", " scope.reuse_variables()\n", " threshold = tf.get_variable(\"threshold\")" ] }, { "cell_type": "code", "execution_count": 98, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "def relu(X):\n", " with tf.variable_scope(\"relu\", reuse=True):\n", " threshold = tf.get_variable(\"threshold\")\n", " w_shape = int(X.get_shape()[1]), 1 # not shown\n", " w = tf.Variable(tf.random_normal(w_shape), name=\"weights\") # not shown\n", " b = tf.Variable(0.0, name=\"bias\") # not shown\n", " z = tf.add(tf.matmul(X, w), b, name=\"z\") # not shown\n", " return tf.maximum(z, threshold, name=\"max\")\n", "\n", "X = tf.placeholder(tf.float32, shape=(None, n_features), name=\"X\")\n", "with tf.variable_scope(\"relu\"):\n", " threshold = tf.get_variable(\"threshold\", shape=(),\n", " initializer=tf.constant_initializer(0.0))\n", "relus = [relu(X) for relu_index in range(5)]\n", "output = tf.add_n(relus, name=\"output\")" ] }, { "cell_type": "code", "execution_count": 99, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "'tf_logs/run-relu6/'" ] }, "execution_count": 99, "metadata": {}, "output_type": "execute_result" } ], "source": [ "save_graph(run_id=\"relu6\")" ] }, { "cell_type": "code", "execution_count": 100, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Reusing TensorBoard on port 6006 (pid 43590), started 0:06:21 ago. 
(Use '!kill 43590' to kill it.)" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "\n", " \n", " \n", " " ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "%tensorboard --logdir {root_logdir}" ] }, { "cell_type": "code", "execution_count": 101, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "def relu(X):\n", " with tf.variable_scope(\"relu\"):\n", " threshold = tf.get_variable(\"threshold\", shape=(), initializer=tf.constant_initializer(0.0))\n", " w_shape = (int(X.get_shape()[1]), 1)\n", " w = tf.Variable(tf.random_normal(w_shape), name=\"weights\")\n", " b = tf.Variable(0.0, name=\"bias\")\n", " z = tf.add(tf.matmul(X, w), b, name=\"z\")\n", " return tf.maximum(z, threshold, name=\"max\")\n", "\n", "X = tf.placeholder(tf.float32, shape=(None, n_features), name=\"X\")\n", "with tf.variable_scope(\"\", default_name=\"\") as scope:\n", " first_relu = relu(X) # create the shared variable\n", " scope.reuse_variables() # then reuse it\n", " relus = [first_relu] + [relu(X) for i in range(4)]\n", "output = tf.add_n(relus, name=\"output\")" ] }, { "cell_type": "code", "execution_count": 102, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "'tf_logs/run-relu8/'" ] }, "execution_count": 102, "metadata": {}, "output_type": "execute_result" } ], "source": [ "save_graph(run_id=\"relu8\")" ] }, { "cell_type": "code", "execution_count": 103, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Reusing TensorBoard on port 6006 (pid 43590), started 0:06:45 ago. (Use '!kill 43590' to kill it.)" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "\n", " \n", " \n", " " ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "%tensorboard --logdir {root_logdir}" ] }, { "cell_type": "code", "execution_count": 104, "metadata": {}, "outputs": [], "source": [ "reset_graph()\n", "\n", "def relu(X):\n", " threshold = tf.get_variable(\"threshold\", shape=(),\n", " initializer=tf.constant_initializer(0.0))\n", " w_shape = (int(X.get_shape()[1]), 1) # not shown in the book\n", " w = tf.Variable(tf.random_normal(w_shape), name=\"weights\") # not shown\n", " b = tf.Variable(0.0, name=\"bias\") # not shown\n", " z = tf.add(tf.matmul(X, w), b, name=\"z\") # not shown\n", " return tf.maximum(z, threshold, name=\"max\")\n", "\n", "X = tf.placeholder(tf.float32, shape=(None, n_features), name=\"X\")\n", "relus = []\n", "for relu_index in range(5):\n", " with tf.variable_scope(\"relu\", reuse=(relu_index >= 1)) as scope:\n", " relus.append(relu(X))\n", "output = tf.add_n(relus, name=\"output\")" ] }, { "cell_type": "code", "execution_count": 105, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "'tf_logs/run-relu9/'" ] }, "execution_count": 105, "metadata": {}, "output_type": "execute_result" } ], "source": [ "save_graph(run_id=\"relu9\")" ] }, { "cell_type": "code", "execution_count": 106, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Reusing TensorBoard on port 6006 (pid 43590), started 0:07:06 ago. 
(Use '!kill 43590' to kill it.)" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "text/html": [ "\n", " \n", " \n", " " ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "%tensorboard --logdir {root_logdir}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Extra material" ] }, { "cell_type": "code", "execution_count": 107, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "x0: my_scope/x\n", "x1: my_scope/x_1\n", "x2: my_scope/x_2\n", "x3: my_scope/x\n", "x4: my_scope_1/x\n", "x5: my_scope/x\n", "True\n" ] } ], "source": [ "reset_graph()\n", "\n", "with tf.variable_scope(\"my_scope\"):\n", " x0 = tf.get_variable(\"x\", shape=(), initializer=tf.constant_initializer(0.))\n", " x1 = tf.Variable(0., name=\"x\")\n", " x2 = tf.Variable(0., name=\"x\")\n", "\n", "with tf.variable_scope(\"my_scope\", reuse=True):\n", " x3 = tf.get_variable(\"x\")\n", " x4 = tf.Variable(0., name=\"x\")\n", "\n", "with tf.variable_scope(\"\", default_name=\"\", reuse=True):\n", " x5 = tf.get_variable(\"my_scope/x\")\n", "\n", "print(\"x0:\", x0.op.name)\n", "print(\"x1:\", x1.op.name)\n", "print(\"x2:\", x2.op.name)\n", "print(\"x3:\", x3.op.name)\n", "print(\"x4:\", x4.op.name)\n", "print(\"x5:\", x5.op.name)\n", "print(x0 is x3 and x3 is x5)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The first `variable_scope()` block first creates the shared variable `x0`, named `my_scope/x`. For all operations other than shared variables (including non-shared variables), the variable scope acts like a regular name scope, which is why the two variables `x1` and `x2` have a name with a prefix `my_scope/`. Note however that TensorFlow makes their names unique by adding an index: `my_scope/x_1` and `my_scope/x_2`.\n", "\n", "The second `variable_scope()` block reuses the shared variables in scope `my_scope`, which is why `x0 is x3`. Once again, for all operations other than shared variables it acts as a named scope, and since it's a separate block from the first one, the name of the scope is made unique by TensorFlow (`my_scope_1`) and thus the variable `x4` is named `my_scope_1/x`.\n", "\n", "The third block shows another way to get a handle on the shared variable `my_scope/x` by creating a `variable_scope()` at the root scope (whose name is an empty string), then calling `get_variable()` with the full name of the shared variable (i.e. `\"my_scope/x\"`)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Strings" ] }, { "cell_type": "code", "execution_count": 108, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[b'Do' b'you' b'want' b'some' b'caf\\xc3\\xa9?']\n" ] } ], "source": [ "reset_graph()\n", "\n", "text = np.array(\"Do you want some café?\".split())\n", "text_tensor = tf.constant(text)\n", "\n", "with tf.Session() as sess:\n", " print(text_tensor.eval())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Autodiff" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Note: the autodiff content was moved to the [extra_autodiff.ipynb](extra_autodiff.ipynb) notebook." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Exercise solutions" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 1. to 11." ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "See appendix A." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 12. 
Logistic Regression with Mini-Batch Gradient Descent using TensorFlow" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "First, let's create the moons dataset using Scikit-Learn's `make_moons()` function:" ] }, { "cell_type": "code", "execution_count": 109, "metadata": {}, "outputs": [], "source": [ "from sklearn.datasets import make_moons\n", "\n", "m = 1000\n", "X_moons, y_moons = make_moons(m, noise=0.1, random_state=42)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's take a peek at the dataset:" ] }, { "cell_type": "code", "execution_count": 110, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYcAAAEFCAYAAAAIZiutAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzsnXd4FVX6xz9z0yskIBA6EhBio2hQFASRIl1AROAniKsu\nSkewgnWVtqCuuq67i7qCJQEpQhJssIKygBiKCgghoZcgIGmk3Pv+/jgzk7k3NwUSSID5Ps99kjsz\n98yZmTPnPW/7vpqIYMOGDRs2bFjhqOwO2LBhw4aNqgdbONiwYcOGjSKwhYMNGzZs2CgCWzjYsGHD\nho0isIWDDRs2bNgoAls42LBhw4aNIrCFgw0bNmzYKIIKFQ6apj2uadomTdPOapo2v4TjRmiaVqBp\n2hlN0zL0vx0rsi82bNiwYeP84VvB7R0CXga6A0GlHPuDiNgCwYYNGzaqICpUOIjIUgBN024G6lVk\n2zZs2LBh4+KhMn0OrTVNO65p2k5N057TNM32f9iwYcNGFUFFm5XKiv8C14nIPk3TrgXigHxgZiX1\nx4YNGzZsWFApq3URSRORffr/vwAvAYMqoy82bNiwYaMoKktz8AbN60ZNs2ljbdiwYeM8ICJe59Wy\noKJDWX00TQsEfABfTdMCNE3z8XJcD03Taun/twCeA5YW166IVPnP888/X+l9sPtp99Hup91P41Ne\nVLRZ6TkgG3gSGKb//6ymaQ30fIb6+nFdgG2apmUAK4BFwGsV3BcbNmzYsHGeqOhQ1heBF4vZHWY5\nbgowpSLPbcOGDRs2Kg52+GgFoVOnTpXdhTLB7mfF4VLoI9j9rGhcKv0sL7SKsE1dSGiaJlW9jzZs\n2LBR1aBpGlIOh3RVilayYcPGJYbGjRuzb9++yu7GFY1GjRqRlpZW4e3amoMNGzbOG/rqtLK7cUWj\nuGdQXs3B9jnYsGHDho0isIWDDRs2bNgoAls42LBhw4aNIrCFgw0bNmyUggMHDhAeHl6ifyUsLOyC\nOIYrC7ZwsGHDxmWJxo0bExwcTHh4OFFRUYwaNYrs7OzzaqtBgwacOXMGTVP+3c6dOzN/vnuxy4yM\nDBo3blzeblcZ2MLBhg0bFY7UtFSGjxtO55GdGT5uOKlpqRe9DU3TWLlyJWfOnOGnn35i06ZNvPLK\nK+fcjysVtnCwYcNGhSI1LZWuY7qyMGwha5qsYWHYQrqO6XpOk3tFtAGYZqCoqCjuvvtufv75Z44c\nOULfvn2pUaMGzZs351//+pd5/KZNm7j55pupVq0aUVFRPPHEEwDs27cPh8OBy+XiueeeY+3atYwZ\nM4bw8HDGjRsHgMPhYO/evWzYsIGoqCg3E9SSJUu48cYbzT7NmDGD6OhorrrqKoYMGcLp06fP6bou\nBmzhYMOGjQrFtLnTSLkxBfz1Df6QcmMK0+ZOu6htWHHgwAESEhJo3bo1999/Pw0bNuTo0aPEx8fz\nzDPPsHr1agDGjx/PhAkT+OOPP0hJSWHw4MFmG4ZJ6ZVXXqFDhw689dZbnDlzhjfffNNtf7t27QgN\nDeXbb781f/vJJ58wfPhwAN544w2WL1/O2rVrOXz4MBERETz22GPndV0XErZwsGHDRoXi0JlDhZO6\nAX84fObwRW0DoH///kRGRtKxY0c6d+7Mww8/zPfff8+sWbPw8/Pjxhtv5E9/+hMfffQRAH5+fuzZ\ns4fff/+d4OBgYmNjy3wuq6YwZMgQPv74Y0D5IhISErj//vsBeO+99/jLX/5CVFQUfn5+TJ8+nUWL\nFuFyuc7p2i40bOFgw4aNCkW98HqQ57ExD+qG172obQAsW7aMkydPkpqayt/+9jcOHz5MZGQkwcHB\n5jGNGjXi0KFDAMyfP59du3bRokUL2rVrx8qVK8/pfAaGDh3KkiVLyM/P5/PPP6dt27bUr68qFuzb\nt4977rmHyMhIIiMjiYmJwc/Pj2PHjp3XuS4UbOFgw4aNCsXLk16m6damhZN7HjTd2pSXJ718UdsA\nioSe1q1bl5MnT5KVlWVu279/P/Xq1QOgadOmfPzxx6SnpzN16lQGDRpETk5OkXYNE1JxaNmyJY0a\nNSIhIYFPPvmEoUOHmvsaNmxIYmIiJ0+e5OTJk5w6dYqsrCyioqLO6douNGzhYMOGjQpFk8ZN+Oqt\nrxiWMYzOqZ0ZljGMr976iiaNm1zUNryhfv36tG/fnqeffprc3Fy2bdvGv//9b9MfsHDhQk6cOAFA\ntWrV0DQNHx9VzNIqaGrXrs3evXtLPNfQoUN58803Wbt2Lffee6+5/dFHH+WZZ55h//79AKSnp7N8\n+fJyXdcFQWWXsitDqTuxYcNG1URVfj+bNGki33zzTZHthw4dkt69e0tkZKRER0fLe++9Z+4bPny4\n1KpVS8LCwuS6666T5cuXi4hIWlqaOBwOcTqdIiKyfv16ad68uURGRsr48eNFRMThcEhKSorZ1v79\n+8XHx0f69Onjdn6XyyXz5s2Ta665RsLDwyU6OlqeffbZ877O4p6Bvv28516bldWGDRvnDZuVtfJh\ns7LasHGBICLMeuope5KzYcMCWzjYuOKxavFijrzzDl9+/nlld8WGjSoDWzjYuKIhIqyaM4e5GRkk\nzZ5taw82bOiwhYONKxqrFi+mx/btaED37dtt7cGGDR22cLBxxcLQGrrpTJ3ds7Nt7cGGDR22cLBR\nJVCRTmHPtopr26o1ALb2YMOGBb6V3QEbNsDiFL75ZroPHFihbXl+FxFmP/00nD1L1k03sd6S7Soi\nhK5bV+4+2LBxyaM8SRIX40MVTrK5lOByuWTmk0+Ky+Wq7K4Ugcvlkgnt2okL1N9y9NGzLafTWaTt\nxPh4mRAWJkmLFlXgVVyZsN/Pyk
dxz4ByJsHZZqUrBFUtXFMspp5VixfTXTfvdCunWcfTwTxz6lS3\n76sWL7ajk2xcdPTs2dNkfr1kUB7JcjE+2CuTcqOiV+YVoYEYq/fE+HizbwLiAhl3883SX1/1n2vf\nPNvqHxIiTsv3kc2aSWJQkAhIYnDwOWkPVVn7qixU5fezUaNGUrt2bcnOzja3/etf/5JOnTpd0PO+\n8MIL8n//938X9BxWFPcMsDUHG6WhIsM1K0IDESnMLXj/6adNrQGUUzj7p5+ovWEDs5588pz75tnW\nQ1lZfGX53j8lBXSWzXONTqpq2ldVh0j5gwzK04amaTidTl5//fUi222UAeWRLBfjQxVemVwK8Laa\nPl/toaI0kMT4eEkKDhYBeczXV0bWry9jNU2eB3kc5G5NK1z1l6I9WFfzsyZMkDExMTLM4ZCxMTHy\nYL168kB4uNwdGSnDHA4ZWb++jNE0maXfi3PRHipS+7qcUNL7WRG+nfK00bhxY5k5c6bUqFFD/vjj\nDxFRmkPnzp1FRGTHjh3StWtXiYyMlBYtWkhcXJz5299//1169+4t4eHhEhsbK88995zcfvvt5v7x\n48dLgwYNJDw8XG666SZZu3atiIgkJSWJv7+/+Pv7S2hoqLRq1UpERDp16iT//ve/JTc3V6pXry6/\n/PKL2VZ6eroEBQVJenq6iIh88cUX0qpVK6levbrcdtttsm3bthKvs7hnQDk1h0qf/EvtoC0cygXr\nRHyuE2JJbXlrw5vZxXNbaaafv4As1/9fBtL/5ptLnIitk4e3Cdy6rWft2jKtY0d5/o47zM/0jh1l\n1oQJ5b72KxXFvZ8VIUzL20bjxo3lm2++kYEDB8pzzz0nIoXCISsrSxo0aCAffvihuFwuSU5Olpo1\na8qvv/4qIiL33Xef3H///XL27Fn59ddfpUGDBtKhQwez7YULF8qpU6fE6XTK3LlzpU6dOpKbmysi\n3s1KhnAQEXnooYfM/oiIvP3223L33XeLiMjmzZulVq1asmnTJnG5XPKf//xHGjduLHl5ecVepy0c\nbJSI4uzhsyZMkOnnOSF6tl+aBuJtlee5zZuw+gIkCcQJ0l9v2zhHX5AEy4quuD5NaNdOEuLiikzg\nFTGpV6T2dbmhuPezIu57edswhMPPP/8s1atXlxMnTpjC4bPPPpOOHTu6Hf/oo4/KSy+9JE6nU/z8\n/GT37t3mvueee85NOHgiIiLCXOGXJhy+/vprufrqq819t912myxYsEBEREaPHi3Tp093++0111wj\n3333XbHntoXDFYqyOkEvdHhmaRqIy+WS8foEOt7Lqn1wVJQ4nc4iwsow/dxTr570ql/f1BqMTwJI\nrzp1xOl0yownn5QZU6ea98Lap4TgYBkcHe02gY+PjTX7VJ5JvSK1r8sN3t7PihCmFdGGIRxEVJ2G\nyZMnm8Jh1qxZ4u/vLxERERIRESHVq1eXsLAwefzxx+Xo0aOiaZrk5OSYbf3jH/9wEw5z5syRli1b\nSvXq1aV69eri4+Mj3377rYiULhxcLpfUrVtXNm7cKPv27ZPQ0FDJzMwUEZGePXtKSEiIW79CQkLk\n008/LfY6L5RwsJPgqjjKkhwmUujgnTR7Nt0GDKhwp9v2778n46ab+Cg5maYZGaSEhXF169aE6Qlj\nqxYvpuuWLWhAl+RkulxzDU+88orpCB965AiznnySp+bNQ0QloU157TW3ft7XujUfHj7MZy4XEYAA\nNYD6x44xc+pUDr/zDsdFWHXzzWzdtIkj//0v83Tqix7Z2cSlpJhtaUDtLVu4HpgNTME9A7q0JDdr\nH7d//z2ZdrJcmVFS5nlZ71dFtGHFCy+8QJs2bZg8eTKgSnV26tSJVatWFTnW5XLh5+fHwYMHiY6O\nBuDAgQPm/rVr1zJr1ixWr15NTEwMAJGRkcZittR3T9M0Bg8ezMcff0zt2rXp3bs3ISEhADRo0IBn\nn32Wp59++pyvscJRHslyMT5cwZpDWW2uF8seXtx5rFqDsdp/RF/xF/EtOJ1etRyXyyWP9usnr/r7\nSyLIBA/zkuGXGA8yomlTuRZkRUCA22p+qcMhD8bEmFpJ/3r1pEtAgPzZx0d616gh0zt2PCcfg50o\nVzq8vZ8VYcqsiDasmoOIyMMPPyw1atSQzp07S0ZGhjRq1Eg++ugjyc/Pl7y8PNm0aZPs3LlTRESG\nDBkiw4YNk+zsbNmxY4c0bNjQ1BwSEhKkXr16cvToUcnNzZUXX3xRfH19zXO9++670qFDB7f31ao5\niIhs2LBBoqKi5PrrrzerzYmI/Pjjj9KwYUPZsGGDiIhkZmbKypUrTc3CG4qbI6lKZiXgcWATcBaY\nX8qxE4EjwCngX4BfMccVe1Mud5Rl0r9Y9nBv5xnfrp3MmDpVEuLizInaZZnYH7BM8IZv4dXJk70K\nvMT4eLnb11f6BQZKgu6D8OaXSAR5AeR2kP5+fuYEMr1jR3mwQQOZqZdsFBFJiIuTfpomM0D6aZok\nxsef87WW5V5eyfkPVfn99CwTeuDAAQkKCpI777xTRER+++036dWrl1x11VVSs2ZN6dKli2zdulVE\nVARRr169pFq1ahIbGytPPfWU3HXXXSIi4nQ65aGHHpLw8HCpW7euzJ492+1cv//+u9x+++0SEREh\nbdu2FRGRzp07uwkHEZHo6GipWbOm5Ofnu21ftWqV3HzzzRIRESF169aVwYMHXxbCoT/QF3i7JOEA\ndNcFQwugGrAaeLWYY4u9KZczyjrpJ8bHS2JQkMy0TMQXQnvw6kgOCJDBgYHyWM+eMigwUMaCDAJZ\nqu9fAtLNz0+mgzwPMg2kS/XqRZLQjGtNAFmqaXKPfqwR2jrIz0+mgcwyhBLIOJDeFDqrPVf6LpdL\nBkdHy6u6sHoV5LqAACkoKDinay3pXhpCISEu7orVMq6U9/PJJ5+UkSNHVnY3vOKSEA5mo/ByKcJh\nIfCK5fudwJFiji3XjbtUUVYn6KwJE2Rky5Yy2sfHNKmcTzSSgdKinqZ37Ci9GzSQaR06yNCwMJkJ\nMjI6WkaAFFA02miEh/aw1OGQRA+BZ0QZzQIZqJukOoJM13//uYcWkahP9q+A3FWrlsyYOlXGx8a6\nCdCEuDgZoQsSQ6D0AXm4d+8Sr33G1KlFnNiGM93bMxofGiojmzW7YiOYLtf3c+fOnWb00YYNG6Rm\nzZpu5p+qhMtNOGwB7rV8rwE4gQgvx5bvzl2iKKvNtaKTszzzBjwFRUJcnHT185O/TJpkCq8lDoe8\npE/YKyyT6kx9Yr+3fn15/o47ZGxMjIzRNJmp73OBJAQFmZOryzKZ99eFwEyQfrqJ6gGQwfrfe/R9\nd4A84nDIq/7+pgBNjI+XwdHR8orFPGUIlNuhWO0hMT5eBgcGFvFlLEGZw7zd9wSQZQ6H27mvJBPT\n5fp+btq0SaKjoyUkJESaNGkiM2fOrOwuFYvLTTjsAbpZvvsCLqChl2PLd+cuc5yrCcRz0rJ
u95Y3\n4M1UMx6kU0CAG2dRv8BAiQW5V9NksMMhrXQNoFdQkAzWs0QNgTcyJsbUdEa2bCnLfX3FBfKoPokb\nTu0ugYEyomXLwskXd1+EVTMxhIrBn/Rnh0N6eWgxA0E6FaM9GNc+E2RoWJipJd0XGGiawzxzOhKD\ngmQCKj9jpv53cHS0jL+CTEz2+1n5uNyEwxZgkOV7ZEmaw/PPP29+Vq9eXa4beTnhXJzRxWkE1u1u\neQOWFb3VVDNSp7YYYZnIXSAP6RP6+NhYebhvX/mzbj7y7I+nAJo5frxM79hRRrRsKf0oapJ6tGdP\nmd6xo4yNiZEeFg3iQX2yX2LRDAzBsczXVzpERJhajNWpPRikg8MhTqezyH3wFLIJcXGmYFrqcJgO\nbavWkKRf92j9r3F/rhQTky0cKh/GM1i9erXbXFle4aCptisWmqa9DNQTkVHF7F8I7BWRafr3O4EF\nIlLXy7FyIfp4OSBp0SK0ESPorsf6AyQFB6P95z9useAiwqRbb2Xuhg1MateObpMn8+VDD9F9/nyS\nZs8mauNGDsfG4nK5OPjjjywGVgH5Dgd9XS6SgoPhww95/+mneXDPHnoAicBfAwO5LTaWX/fupcbB\ng2hAAz8/vs/PZwUwCegWFITjo4/M/lj7bPS124ABDGne3GzbQCLwQXQ0bQYMQMvNJSs52YwhFxG2\nbtzIkrNn0QABBgQGckNsLCLCd5s30zE7m9PASSAb8AFuBn4Frpk8mda33MKqUaPoPn++yhPZsMFs\na2JsLH+cPMn8PXvMbaOaNWP+rl2sWrwYbcQItmVnk4Fa6SwDevj48JjLRT8Rr8/hcoSmadjvZ+Wi\nuGegbz/vhKcKFQ6apvkAfsB0oD7wMFAgIk6P47oD7wNdgKPAIuB/IvKslzZt4VAMZk+cSOZPP7kl\n3YgIoW3aMGXePHObdUJODAoirn595u/ezahmzWiWlkZ6fj41fX3ZUFBAPaAhKnwsCVVHVoD7mjYl\nZO9e5osUTpaaxr2ffso/R43i86wsJgKHgP9Dhawl6edf1a4dc9evBzCFlNGGIaxWDB1KzYICTqIS\nniL1/UeB04GBjFqwwG2iTYyPJ3/IEPq6XOa2ZQ4HAZ99pr7o1zsJmAv0AR4C7gGWAm83asR1derw\n1w0b6BQRwdM5Odx99qzZ1ksOBzeK0M8y9oz2t3//vXnf9x0/Tv8dO+gHLAECgbst1zZ3/frLmgXU\nFg6VjwslHCranPQ8ynfgtHymAw2ADKC+5dgJ6O8+dp5DEZQ1dt6IsJnx5JPidDq9Et9N8EhQs5pK\nBusmnLEg3fX/u4HcaTEbCcg9miYP4R55tETT5M7GjU3zzXIUF5KbmUs3UXmarYzPysBA6RAR4ea/\nGIiKVJoGEgsyg0JKDgOP9ewpY3QmV+MzRtPksV69THbWQbrZyTPnwlrXIREVLturXj3T8T82JkY6\ngtxvaXs6SBeQ0T17ut1bw/Rmtms5T3lCii+V3IlGjRoJShban0r6NGrUyOuz0efO857PK5Q+Q0Re\nBF4sZneYx7GvA68Xc+wVj7LWVF61eDGb33iDkwUFuPLzOf7Pf/LlzTfTbcAAZj/9NNe3bWvSEAjw\nJTBXX233dblYqrdzFhiLWrV3AA4DrwFzIiMJDA4mKyuLq06dYhRKszgA5IaE4Dx+nJ56G77Ag3ob\n6H87ORzENW5MzLp1AG40FCLCmp9+4oZTp/gKlfyiAX/S/wrKJCTAXVu2uFEnNG7enMzMTLCsyiNF\nCG3WjCnz5jFrwgT27t5Nv/x8VgH3e/Srf0oK4nLxJfA3YMDp00z/9ls0TWPSrbfSC9gaFoa0bo2m\naexPT6fZrl2I5XxJixZxT0qKW7v9gE41atDp2msBzptioyJral9IpKWlVXYXbFwgXBCfQ0XiSjQr\niYePoDjThIgw8ZZbOLxxI6cAP39/VublmaaaVaNGsbNWLWLr18ehaexLT2fAzp30sZhiEvW/7wFG\nCZuJwDzgPlSG4h8BARAZyWdHjig/AkrIZPn40M/Xl165uYDiMNqHmtCb6X931KhBt+HDmfp60XVA\nYnw8bw4eTAJwb1gYOJ1oOTn4iZAD5AErgCHAJygzzbwymmkS4+PJHTyY/nq/9uv9OQv4o8w/wShz\nU3fgUaCJ7ofw9Il0vece7q9fn0+PHHF7Ho/36oUjMZEalvG5DziiaUyIi6PHoEGl9tMbyvr8bdgo\nCeU1K9nEe1UQ3iq3eVs9rlq8mFrJyewBrgFq5eWZdZg/eOYZRmRmkp2Vxa0zZ3L3oEHMnjiRH2vW\nZLOmcTI9HXbsIEKEXylcrSehbOYaarUdBLyfm8vQI0dMTeCv/v58dfYsnRwOFtWowWIfH3C5cGZk\ncDY/nzNnz3KgZUtEhOZ79nBjhw5F+i4ifPDMM4zRzzUyLw/fBQtYNWcOf92wgYHAY/q+B4GvKKo9\neIOIIsxL0+/fFiAdNWl/gRIEScDgsDDOuFxszcpCUIJi9b/+xbG1a00yv+7Z2UyaPZvNP/zAMP36\nrc+jcfPmZGRksDktjdaNG5Oclobj+HESc3N58Omn2bppEwBTZ8w4p8m9rM/fho0LivLYpC7GR3Xx\nykFZw1NdLpeMj42Vey12brdYf00zcwDu1TN8rTZswy4/VtOkA8i9KFt/T9xt8+NRiWNudZj1cM3E\n4GDFleSl2I6VLttb/61hsUa7fSMiJEHnVupB0RyFgb6+btxJxn2wXpdn1rKg+1n0/w2OpuX+/tJJ\nvz7D7zJC0+RVP78iPpE+AQFeOaWsocCvTp4sjwcFyUv6cS+CjPb3l8GBgfJo//5l5mdyOp127Qgb\nFQLK6XOo9Mm/1A5eYcKhrLQZifHx8hc/P+lFYVZyIoVZxcvBnKiWgPxl0qQiCW2ek1CHatVkqaa5\nnXsZii/JmnxmxPdb2VI9i+286u9vZhp79t9IpvPMQ1gC0qtuXbnDx6fIvkR9wn3tiSeK3AdP4bQS\n5E8WQenJ8DpBv2eG89wQoktAOkdGumWl96pXr0hflvn5yeDAQHVu/R5aWWONv4ZwLQvpn1XI2LUj\nbFQEyiscbJ9DFUNZw1NnT5xIwgcf0PD0aT6g0IHbGcVmmIKKAFisb+/j58eK/HwmtmtH1B13cMNN\nN+EYOdItR+IxHx8O+/oS6HBQMyeHGqicgJuB9UAAUBPFdRKKqpGwAhW77LKEyEKh38IasmrYzpMW\nLeKLIUM47XQSbTlmD5AWFMT1OTnkAbkov4N/UBC5OTk4gbzatVlx5AgAs55+miOrVzNv40bTz+IY\nOZKl2dkIkBYSgo/LxSM5OfSz3ONEVHjrZDBzNhwoX8pEi19DROgdFUXbY8fMkN4UIEfTaCfC5uho\nHjx8mB7Z2eZ9yAf+jTLT9UKZsJKBLdHRfPrbb255Gka9CCgM8R1Spw7XNGuGw+Eo8fnbsFEaqlSe\nw4XAlSYczgWje/Xi7qQkM9ZfgJHABygh0YrCmH
snarJaERDAR5pGet263F6vntsktO/4cYL27EHr\n1o2cLVs4dvKkObEuBf4CXA/khYURUb8+R3bsIAYlhK7DkjSHXpzF0ldrUtjsiRP5ddUqBuzaVcQ5\n/jbKN2AmowGHatdm+LFj9LO0IyKsfOAB6jmdPJWXZ+Zv/Hv3biajJv8BISHc+qc/kZWczIGUFJwZ\nGWiaRnZuLrlnz7LUch4jH2JlQAB+CxfSbcAARt9zD/2/+ooeFgG6AtgKPIPK87DmfUwC/gr01q/h\nr8AT+vbTwJD4eNNJnbRokZmAt3TBAvM8V0rynI0LD1s4XMGwahkn09PZ/euv/FnT6CtiColBqMS0\nHyhMaJsIpGkaf7ZE1IioyKeojRs5EhvL3PXrmdy+vXvCGmoCTQoK4u1q1Rhz9KiZ0Twb2K9p0LIl\nW3bu5GqXixyHg9CoKBpGRxdZ/Rp9B0jRq8udRCXJvG25xiRgIxCLWuULanUvIry+caOpoSShhJO/\ny2UKphXAz088wVOzZ5vtuVwuuteoweTTp92ysZcC/woKolZkJC0GDWLvvn0cWLGCq5s3BxF+37GD\nGqiIpxuBG/TfWdswhGI+sB04rvdjKSr8d6uuPUChpvBgdDR/7N3L53q/PbUsGzbOF7ZwsGFO7Gzc\nyDwU9QWoieslVKZhPwpX8knAT8CK6tVZ9/vvOBwOkhYtInnYMI7n5VHL3x/GjqXN3//uZnZa7nCw\ntEULRITAnTupJcIpXSBEXnUVIsKvGRnUSU6mNyBlWAVbs7dnA5moCfgoEKMfE4Iq/jEXNfmuCAhg\nq9PJswUF5oS8DRWRdCgwkM8tlBpD6tTh08OHzYn2tcmT2Td3LrUpzHv4XdM4ERlJaz3kNjE+nuVD\nhtDP5SLJQqMxChXe2wPMvu5GCbRAIAcVGXUVSlNIAnqiBMNuoKWvL30//VTZc/Vrfgloi9LqzHti\naw82KgB2KKsNVi1eTO0tW2hjfEdNpEYS2duoPIL/iBCdmQmoSeyG06eZOXUqT82eTdLs2ZCXxzxg\nYl4euxcsIMdL3eQWrVvzdVwcSSJKExFhUlgYz69ejYgwMDycN9FNLNnZTC6lprVRm/pfGzcSdfYs\nkfr2WrhnUy4HBqPb9TWNmwsKEJSJZ1NoKNe2aUNWejqjdu92S0p78MwZMxRURFi3cCFtgV+AujEx\nRF51FZnHjxOhh9yKCHHPPst8l4tJQK3kZDo7nWaC21vBwXzq54eI4BMWhvP332l79ixTUBrZp/r9\nz9fP/wh5ydxZAAAgAElEQVQqHPjfYWHUaNWKrWvXcuR//2OuLnQDgY+BJE1DswhZuz61jcqGo/RD\nbFwoiAiznnqK4jSj0vYbx6yaMwefvDy+B7oCnVAT0yoKcxYeOHuW6/LyeAEV0+8P/B1Y99ZbJMbH\nU3vLFvPYHsDtp05x27hxvLBmjfl58b//xenjwzVHjvCVfn5rLP7MKVP4U1aWadb5yrKvOEyZN4/2\nY8dSq6CACOCkprEjJETZ6IExwAvAjyjyvIWApk/Gq1AaRtu8PNqPG0dM9+782L49L9xxh/lZf9NN\nbNOzs1ctXszYjAxeQvEsHcvLY/q331I9PJy38/NJmj2bxPh4M+u5G7A1P9/06fQDml1/Pe+fOsWH\nf/zB4NdfZ3hBAVMt9xr9/976//3176MKCrht/HhuuP12M4cBYCqKi6qXCI7QUILatSP4lluYNGcO\nA265BZfFJ1OW8WDDRkXBNitVIgynZI/33/e6Sixtv3GMSaqHcoK2R0X+7MXd13BPYCC+Z89SE8WK\n+BzKHv5R69Zoe/YQn5FhmmPuDQsjdtQot8xmEWFI3bp8evQog8PCiNGpJUSEkNat2fzZZ3x69KjZ\nxuCwMFq2akVY27bFRtqIiBsZnwsYGBLC51lZzEERcu0NC8Ovfn3Tgb0UWNayJZkHDxKXkeG1r57n\nsEY2WYkDm0+cSJt331X3LziYt8PD+cJyDaOA+RSaoAyG2m0//ghnz5KZnMze5GSyMzIIA06g2CZ/\nQ0VzGYmFoAgIo265hazkZE6dOAE7dhApgqCivwr8/TkA/O5wUO2uu/BZsYJGFp9JWcaDDRsGqhTx\n3oX4cJnmOXir4FZS4R1vpHoi7hXjOkdEmPWbH0QRylnzE5b5+kqXwMAiCXN9q1Uz6zqXlltRXGGh\nsuZneMLzd4lQJK/AWlvCyFUYHB0tiWUocmScY0xQkFktzkxwA+npkeBm5Dy45V4EBcmDOjHf9I4d\n5bGePb3WwBBwK3U6iELSvlkgCZZ+elb6m9ahg/Tw8xMXqj723XqCYP+QELP2REVW/LNx+QM7z+HS\nhLe6BiJirgzF4rRMCg4mefRojr/3ntdVo8vlYmD79jTUI3hAmWQ+Be4FAkJDiW7ThpMnTuDcuZN3\nPMJNl/v6sqRZMxrVqmW2KR7RRWKJZpqqH2ONqilrfoYnPH+3bc8ewjIy8AkLo2F0NKBCbAfs3k2f\nggLVLl7CSIuJ8BGLZnJPYCANzp6lhr5vH8rsY82BWOZwsKxFCxpedVWx98HKe2RoAgA/pabSqlEj\nkn/4gS9cLu4FrkX5fX4HasTE0KBrVwgMZMprr5la1+ynn6YgL49r582jH/AKKiKqD4URV63atSsy\nXmztwUZJsDWHSxDespPHt2sn42Nji1BPGPv7h4RIAUhXnQrDqmW8OmmSPALyiq+vufo2NIYvUNnK\n1mxeo81BlnKYRm1qb1TRLpdLHu3XT17195fxqHKeBn1GRWXulkRR7bnKHhkTY9KOl6alWFf2y3x9\nZWTLlmY799SrJw+Eh8v9EREyzOGQsTExxdbpLqlinHF/xoeFycO9e8tyQ+PRn4WRoT2yWbMipVcT\n4+NlfFiYdNI1GKtWZ3324/SxYVNq2CgrKKfmUOmTf6kdvAyFgzcTzBcBAfKav7+aDByOIkXuvwB5\nFWQMikLCoFtIiIuT/iEh4gLp5XDIc7ffLgPDworwI93btKlphilpQk2Ii5Oufn5udA+J8fFyt6+v\n9NNNUv1ARrZs6XUiLc89mVDG2suewsIw93ib1EvjKSqLucbomzcBa5RF7e9wiBPkbofDbX83CkuZ\nLvXxcSu9auVRMsxZnnWyDQqTFz222ZQaNkpDeYWDbVaqBHiaUkSEPcnJ3JiRwQ2o+gLBgYHUadKE\nU7t2kevvT4uzZ/kNZSq628+P5jfeyBs//ki32rWZcOyYyn4GlvfpQ78vvzRptEE5RP/lcBDVogU1\ndHOJcV5Pk8mQ5s2J2rOHIx4JW902bDAzoI3ynVY6iOIgUkgTUdyxIheGorosZVS9mfeKK7E6qlkz\n7jt4kB45Oeb+V/382JSfz59QZUiNTHQDS1HhqgZNh7X0avLo0bT++9/pkZ3NMlSp0QMomnQNKAAC\n/P3Jczo54XTSSg+9NfplU2rYKAm2WekygKFJOEG6oojbRuhmCIMFNcGyolyCqpBWANLHwwTR3uGQ\nASEh8
kB4uAwNCpLBIPcHBcltoaFFGE09YWVKHamTxSXGx0tiUFAR8rqRZSCTM67Nm0ZQmqmmIlCa\nhlEWzaIks9T0jh2ld0CAaQaahQoCeBy9Mh2KMXcWRQkAnbq5yFPDm0Fh5TzPPtqmJBvnAmyz0qUP\nYxLrVb++GWH0IqqMp6DKcfbRTTrGRNId5CEw7dtW81MXh0Me7d/f9GEMjo6W8aWYbAymVKM0aIJu\nihoXGyuP4F4y1Ng/ODq6xMmqpImtJFPNxZoES4uwsvbfpU/c42Njzb4lxMXJixah7Y0BdqAuLLr4\n+bn5SbxFZS3RNBmgaTKyZUuZOX68zHzySTemW9uUZONcUF7hYJuVKhgipZtRvMHlcjEwPJzPs7KY\nqG8zWE2t5ggDK4BpQFNURMwvmoZD08DlIgNopGk08vXlqfx8M7KnJJNNYnw8cffd5xYB1Ae41deX\nHwoKqIUi2DMymAU4odNBlCUHw2qyESnZVHOxonFKi7By6z8qma2Wvz9tPv6YbgMGMOqaa2i5ezdZ\n+m83An/GPfppObAEOKppRLdsaZr1rFFZIeHhHNyxA//AQNqePcsRo5LfQw9xpHZtPt2zx+ZdsnHO\nsOkzqhjOt/avNbu4NorYzXiqPwP7RPgoMJAYnTfIBYQDw4B7gKWaRoKmMQCdPkOEUfn5JAH36RN+\n123bGD1gAH///PMik8vyDz7gHk1D0wWxBtQF1ouwAujl54czKIg/wAwzrSnCtmJoHkRU5vZcj6pq\n3QYMcKt0dk9qKp81a8b/PMNoLwJ9RGn2+u3ff0/mTTfxA/BrcrJKuAsIwGftWkSEAamp9LEc/2fg\n46AgFjgcBPv4qJcTyAgNJbZpU0LbtOGJuXOZ/fTTLF69GlC045sXLcIfGJqbSz8gUa/k92lGBqN0\nuhNwz0a3w1htXGjYwqECYU6IGRnmRFiWFZ6IsOXjj3lK/+4DvI/i3JHISJpffz1aejq1fvuNF1Ck\nb1NQq9IA/Tf9XC4+RAmNHhQWu49DZfgCaDk5aMuXq8nZo77x1c2bszkzk58s/c0/fpxHda6icX5+\naPPnl3lSsgoAKJzYVi1e7CY0+hQUsDo8nOdXr65yq2FDeCQtWkT7ESPQgD85nWgdOrBt3Toy27dn\ns6XPtUVoWoqTOGnRInPx4HK5+GrOHPydTpoDfXXB3CM7m7iUFAD6axoPtWzpnndh8y7ZuBgoj03q\nYnyo4j6HinCserV96x/D/j5rwgR5vGVLGQgyGlXWc4zu7LSGPBolLw2/gBFGabWHj2zWrFSbvtPp\nlMFRUeftCyjOGfznnj0vqUpnZS3beq5tjY+NVb4gkA7F+HSManvWsqQ2bJQV2D6HyoW1aMuqOXOY\nu2EDALOAI7GxzPvf/0pdERu275Pp6ezcsYP2+vZQ4HqL/f2xnj3Zn5jIFygzT0hBAcEi+AANULTV\ntYB/GO0CaYBPTAwZwICdO+njcrHc15eAEnwFoKitr507l77Wa60AX8D5ZlJXFsoSDns+bb3q58dv\n+fm8jyJLvApojmKZrY9HtT29QNOoBQtsjcFGmWHXc6hESDGOVcN5eZWfH/t79fJq4/eG0b16wapV\n5F5zDQ116uYDqam0GDCAKfPmcV+zZkSmpPB3VMWyn0V4Ki/P/L1BSOdJgxHSurWiibYW7inBsSlS\nWB5zb1gYTS0Ee1V1Er9QqChhZh0roOhNHkSZAFegTH+PoHIdjqP8Sbkoyo2TBw5wY0YGR8rojJbz\nDIqwcXnBznOoRHiLgZ/esaMM0jOU7wwMlNF+fl5NJp50EZ4mhxl6GKORI5AQFye9ULHwhrlhYFiY\njGnZUsZqmoqr1zR5rFevEvtZFlPOhco7uNzhcrlkxpNPejUBWe9pAir/wTPHoZtuLpxJYf7KvVFR\nJiliWZ/FuWSb27h8QTnNSpU++ZfawSoqHIqzRRtx6UaiWHGsqp4vsHXyeNXfX8YEBak8At3mfO/V\nV5sTiuE7SAgONo8pyR5eVrqJkq7LtneXDoP9dXBgYJGJuTj2XKuP6UWQlzxyIJZY/BFleRZ20pwN\nA7ZwqCQkxscX4SqyTtaJlpc6IThYXp082RQG3ui4rclWRsatIVy+CAiQu3SHs+F4vhdkZP368phO\ntufSV5zW7Nrzva5LyWFcVeByuUyyxPG4J8t5HnfPTTfJ47rGNwxF6T1df37jUZTdM1BZ1DNwJ+Ir\n7lkYmqidNGfDQHmFg10J7jyx/fvvWdm4McMdDsbFxPDCHXeQ0LgxNdPSAOVzMGo298jOZuO77/LX\njAySZs8madEiemzfDkDO5s3MmDKFO3/6qUj1toEijAbuzs3F3+EwY+r7AK6wMLJq1KCmXvlsVEwM\naT4+xDVubFY+O9/r+uGmm4qtpmajECKFldlWLV5M1y1bzEp6tbds8VoBb9XixTTctYv0/HzuFuH/\nUOVQX0RVhbsLVYv6KDBT/xsJPKSPMeuz8Dz/4bffJu7ZZ+lmyS1Jmj0bl8tlV5Czce4oj2S5GB+q\noOZg2JYNegpruOn0jh1lZEyMLPeglP5C9xWsDAyUvhERpnYxHqRdUJAMdThkTMuWpr/C0Ab6o9hY\nPakWiqN5sE0JFw9WChBPinVv2oP1OY3UNJmpawzPg9wPMhRkAEh7XWvoQaFvyRsvlpWZt2tUlKwE\nr1TmVq3VRtmxN3WvDBs7TDqN6CTDxg6Tval7y7SvqoByag52tNJ5IGnRIlY+8AD1nE6eystzKx05\n5bXXmDNpkhnhIiKkJCdztV5GMh9Fd9EXmATMRSWrLQE6VK/O9Lw8eljCJhOBF318uCYkxMxMFimM\nYpr6+uulMovaqHiIFEYfDYmO5oEDB4ow4SbrVBveGGCXORy86+9P1NmzZADB4eFkhIVRrVo17tm5\nE3+Xy2R4TQwOxuHxTF0uF/fXr8+nR47QrXZtWhw7hgtwaBpay5Yme6tLhM27drHi2DGbeuMckJqW\nStcxXUm5MUUVXM+Dplub8tVbqnp6cfuaNG5Sqf22wo5WusjwtC0bfoLiyO2sNnwXSE+QaSBdLURs\nRq2Gu3U/wvN33CFjY2JkrKbJdJDHfH2LlOT05r+wHcgXD9bn+pivrwyJjJQHwsNlRLVqMqJaNXmg\nWjW5p169EhlgR2qaOFEEiq9Onmwe46QogZ9nUMOrkybJMl3DMCKbJujfrc/fjjw7PwwbO0x4BuEF\ny+cZZNjYYSXuq0qgnJpDpU/+pXawigmHxPh4sxCPUZjFMzLJOjEbpqbpHTvKLaGhsoSitYqduvnI\nrPhWUKDCWS3Cx5gcPM1ZVgfk5ehArorq+/kIZG+O/gR9UTAepFNAgKyMi5PE4GC3YAZrsMOrkyZJ\nVz8/WfnZZybdt9XkaIxHa4U6e+Fwfug0opP75K9/brnvFqnVvpbXfZ1HdC5T2xdrTJdXONhmpXOA\nSKEpwUgmGxwWRkj9+gzYtcss4uLNrJMQH8+ywYO5x7Kth/43ic
IiMUbBniarVnE8L48eKMe2URzm\n8DvvuJmzvujcmZoZGZdMxvG5wJtq3/DHhrSOas3RzKMcO3yMOg3r0LRWU16e9PJFU+nPJ2vamkx3\nMj0dduygugi7UAWcHgR21apFm8hIduzcyVWorOmaQDpwMjKSI04nrf74g43VqvHsH3/Qk8Ja4dbx\n2LJVK8LatuX6226rsOzuKw3Dxw1nYdhCNe4M5EHoF6FkVs+EDhTZNyxjGAveXFBiuyWZqyp6/NoZ\n0hcR3iaFxOBgPqhbt0RaZRHhvrp1+ezoUSYBUUAWsFtv4wzwBYUveK+gIIJ9fYnPyGBwWBgxrVsj\nIuzavZtPjx5lIorOGy/nupxQ3AvKVyjKyM5ANrAZAjMD6da6G69Pe/2CC4lzyZoWKZqtPHviRDJ+\n+olNGzcy+uxZ+qKy2z8EaN2a2lu3ss/Hh4T8fHNM9NQ0aovwPmpREQscRDHyFkdxcqlRlVQleJvE\nQ78JJfOmTPADNqDG3zlO8MWN6bIIlnNFlRIOmqZFoJgAuqIWPM+IyCdejnseeBY4S+GceIOIpHk5\ntsoIh9kTJ5KxeTPJaWm0adIETdP4PT0d12+/8XZBgXmc5+osMT6e3MGD6Y+aBBYC4nAQ43KRjnJI\n97Cc5zV/f27QNHrl5prOSBHBOXw4vXJzSUJnOfVyrguJ1LRUps2dxqEzh6gXXu+Cr9Y7j+zMmiZr\niu74FBiAEgzn+ZJeLBjcWz3ef9/tGXmrnzERSPH3Z3ReHvmocSEojqxrUNTtz6K0Sz8Ul1YmcErT\nQHdCiyi6FAIDbfqMcsIY7ynHUzi6/ygnsk+QGZEJrfQDtgACtbNqsz5+fZnGXHFjunNqZ7794NsK\n7X9Vq+fwDmrCvwpoA6zUNG2LiOzwcuynIvJABZ//gsIoAJMxahTtx40zV2cZNWrQxyIwRApplUWE\nuGefNWmz+wH/BEa7XPRCmQV+AFbqUSYRNWvyS3IyT2VkACpHYuKsWaBpzNOjYboDfR0Ovr/tNjRN\nuygUzm4rqRpAHvxvzP8u6ERcL7ye0hQ8NYcAfdsPFAoG1N+UG1O4c+idfPvxt5UuIESKp3Bf/sEH\n9Ac3SvMewOa8PL5A0bYvQJHvBQKrUZcKyvw4AMW/dLhGDW6NiSGsbVs3ivFVo0adc02RKwHnssBp\n0rgJL096ma5jupLWOc1cgLAaaAd0Ut/vyrir1LFmnPfX3b/CXqAtUF3fmQd1w+tWzAVWICpMc9A0\nLRg4BcSISIq+7T/AQRF5xuPY54GmZREOVUlzsPocrOac4laHoF7U/Pvvp09BgbkKjAZecTgIdblo\nCfwIXF+vHtfeey/X33YbjBjhFs76qr8/N+qahIEVwM9PPMFTs2dfhCu/uOqwAW+qPd8CDtSL+T2F\nM6YV30DT8MrXIEoKMZ41YQJb588nOiODFFRFP4AQ4AjwV5QfIgJlQhyFUskNYbIM+ACo43DQ77PP\nzPocxY1RG+dn7y/WtPkD0N7998UJnmLH8S1AcNX1OVRkhnRzoMAQDDq2osL6vaGPpmknNE3brmna\nnyuwHxcM1gI2RkUu6+owafZsPAXZ9u+/Z1P79gwPDaUnkIrKbbjJ5cIH6I+yH6cePcrkv/6VbevW\n8WFgIGOAF4Bxmsa60FAW16zJvYGBDAfGoUpSrv3ooyLnu1A4dOaQ+wsC4A+Hzxwud9upaakMHzec\nziM7M3zccFLTUgG1cvvqra8YljGMzqmd6Xu0L1F+UdAatXpzoV40K/IAH6VBTJs7rdx9O18Y48Ka\nrZw4axYz9UzlG26/nQecTm6FIlnS3VFulaYoGWgUburp58dQ1LjYjHoc77hcfPbMM+Y48DZGbShM\nmzutcIIGU9MsaZwUN+6rZ1dnWMYwN8HQdUxXFoYtZE2TNSwMW0jXMV0LTVMe5+VOqL2utlsbVQ0V\nKRxCgT88tv2BKj3sic+Alijz0yPAdE3T7qvAvlQ4vL3sVioMz5dRRFEbPDF3Lu3HjiUmN5c6KLtb\nJPAu0AhFl/EOUNvpZMaUKVx/222cOXWKv6EmgTdEuKZZMwbNnUtobi4foUwOLwJjMzIu2stvmnis\nqAB1uKSXCpSAWPDmAr794Ften/Y6vr6+ytjuAg6jVCijX4bK34oKE1znC2+V8Gpv2cLBN9/ky88/\nN2lK3qtXj4+rVWOQvz/3ORwM8ffnk2rVeLduXTY6HGY96n5AeI0aRHfoAHfcwYGYGIY7HGap1dED\nBuByuVg1Zw5ds7OZBXTTx6ghOIwxWVU08YuN81ngFDfue7XrxYI3F5iTekmCp7jzxlwb49ZGVUNF\n+hwyUWZQK8KBDM8DRWSn5et6TdPeAAahhEaVhLeXvdu2bcqfUEydZKMc5LZ16/jGx4cpevTJI6iV\nYXPgRr2tYcDcN94geelSHtWdlMZ5um/fznszZzJSr/Hc2eHgoRYtaFCz5kUrGfnypJf535j/FVHJ\nX37r5fNuMzUtlTuH3llozwW3l8rTXDVt7jQO3HrA/UVLh5BFIWTVzlJSsx3KllvJdlyj/vR6S8Ta\nL8nJxOv+h9LMPa9NnswNc+e6jYMHz5xBGz+ebgMGMOnWW+ntcgGq1Or7y5czY8oUemzfzpco09RX\nFJZm3fbjj1zftu151Te/XFCcD6ukcVLWcX/ozCHli7NCFzznc96qgIoUDr8BvpqmNbWYlm4EfinD\nb4VCc2oRvPDCC+b/nTp1olOnTuffy/PEtnXr2FmjBj/cdJPpdP7pl194JDW1+DrJ+kTQddIkdrz5\npknE1wsVmSJAT31bX2Ch08lve/fyFfAT8LvupK5eowZ+u3fTx5gMXC5Wh4Xxwpo1F82ebJh4ps2d\nxuEzh6kbXpeX3zr/aCVDY0jzTfO6qvp689ekpqW6te/1BbwKrm99PemZ6RUquMoLz1BRax1qQ8Ms\nboIWEdYtXEgOsAk1TlLCwri6dWvC1q1DRIosVB5xuZj373+T06YNO7duJU4Pg27ZqhXp779PwHff\nsSMqivnnWN/8csL5LHDKOu69CoB0SP0tlZyGOYSuCSWzY6aylVyg8blmzRrWrFlTYe1VdCjrx6ix\n/DDKMrwCaO8ZraRpWl/gOxE5rWlaLPA58JSIFPFsVhWHtKfTOWnRIv42dChXN2tGDZ3HBtSLfSI0\nlL5r1piOyH+2aMGILVvoq0/uoDuUgacs5/gc5Y9YS2F878R27eg+eTKOkSMvq2Qm09GnO/aKOPzW\nFnUqDx83nIWuhWq5YSwnroVhjmG8POll9xf4IibFlQZvyZPFOYtFhNEDBtB/1Sp65OSobcBoX1/6\nf/IJ3QcOZGBsLNcFBeFwOMyEuggRjjkcNJowgTbvvmuOPT78UJlDN2wg3+EoMVHzSoDhA6jocVLE\n6ZwOvut9KehR4JYncV3962ha9+IkbVblPIcTwJMi8pmmabcDCSISrh/3MdANddsOAm+LyNvFtFnp\nwsEzAuSvP/zA5Pbtm
bthAxPbtSPqjjuYOmOGqVF4TgS9a9em7TXXcDAlhTOHDuGDKgF5GrWQcAI+\nISE4s7KIAqw3YkVAAIl33XXZZUHfMugWNvy+QQmCExQWUraGCga7R0N9t+47uozv4vbC+Sb58s0b\n39Dx9o6VdCWl41wyqpMWLWL50KHkREeb5V73pacTuGsX9OhBv5EjSRo1irNduvDO4sXmODTHWkAA\nX+Tm4tC/D4mOZuShQ3yZk8NcKFU42SiKsoa/WgVP6m+p7uZSuODRfZ6oUnkOInIK3BgijO3rsPgj\nRGRoRZ73QsMzAmTm1Knm97u2bOGjrVv5MjaW7gMHevVNjM3IQBs3TvGVjBhBt+xsZqOKx6/Sj/ur\nry9Xo+gSxgAngoII8PPDJyyMls2aMWXePK/ZtpciUtNS2b5/uxIIRmjfclQhg1BUPCcUcRa+F/de\noWDQ9xf0KOC9uPeqtHDw9D8Abrkw1m2r5szh7fx8JoaFEXTLLTzxl79wd4MGJDmdTEpPJ2nOHOZl\nZDBg+XJmPPGEWQcE1Fh7ODeXLylMqgxKSQEReuCeU9Ft+3ZGDxhQ5vrmlxPKMtkbx+w5vIdfDv5C\nZpdM5dPaDIsHLPaajW8ET4BKdkvzT3M/cSUHSZwrbPqMUuCpCbiAgSEhfJ6V5ZbZSmwsc9evZ1C7\ndlwXFISmafyUmkqbJmrwhLZpA0DmTz/x69691Dh4kDygAWou3KFp7tmy7dox94cfmPPMM6YwKCmf\n4lJCsbHja4EuFGoPbZTJyPrCXazs0sqAVcN4zd+fwz4+5HbpQsCKFfQGfrJmzgMvBQdz9dmz5AYH\nUycz0+Rh2hURQZ2CAvwyMuiPcgZmooTCST2bWoC9u3czTjdVXSkoS66D2zGG2dNLNn7oN6FcW/9a\noutGFxEwlZEX5ImqlOdwWcJTE1gFjNQFAxRmttZOTmbgLbfQcNcubhs/nlvHjiX69GluHTuWkFtv\n5Ym5c5kybx4vrFmDo6CAvwMZYWHQsSM7YmKKZMvetWULM6dOVdElZcinuJRQXGifORr9gc4Q+l0o\njwx+hOHjhnPrkFvZtH7TBQmnrQqwhkoLcCwvj9dzcjiYkMCbqLoex/Ly6KknQvYAmmdnE+NykZ+Z\naYY+vwU0qFmT4QUFxABfaxoHYmLQ7rgD7riDiA4dqN+1K75hYazIz7/kx9K5oiy5Dm7HiDqGLRTJ\nxs/sksmGXzewcMtCbuh7A9+t+85s4+VJL9N0a1O3MOumW5Wv4VJBRdNnXHbwNAms276dq06e5LPI\nSBz5+TTVaS7S8/NxbdrEPGDi7NkgwryMDEY98wzVjhwxwwdXLV7MqDNn0ICHnE60ceMIWbuWuAMH\n+Ckjw9Qc9vj5cXrBAlbo0SUul6tIPsWluuIrlhbDusbxh2ZNmzFqxqhCyo6awEpUuFcViUqqKFgX\nIUmoUrGzgMddLpUjAVyH+wJiCEogvOixvX9KCrhcTEHXfMPCeH71atN8lLRoETf+85+XxVg6V7hF\nvJ3G5Ef6OqswOs7tGA01Ng0hYYU/akx2gcy8THo90Yttn26jSeMmZYpyuthcZecK26x0DrCamEY1\na8Z9Bw/SIydHOf5QdAc9UE7kn0V4Ki+PZQ4HAS4Xqzwc2VbHYDcv0Uhu5HtBQcTVr8/83bsvC4fi\nd+u+484xd+Ls5SxKJ2Dhm2m8unFRp166fmxNaFzQuEpwKFUEDAZVgF+Tk/ksI4P7KaTjngWsB24A\nfkXRS0WgmH0jUHNZFkD9+oQeOkRDEabobVud3+cSOXU5ot9D/VheZ3mJpI0TX5jIsr3LlCabj2KL\nCzV1pfYAACAASURBVMErTTc/oNLY9e9lNRtdDOruKhWtdCFQlYSD1Sa83NeXJc2a0ahWLdKOH4cd\nO3gf3PwQJq020C0oiC2PPUabv/+9SNTKss6dqWWJRrImTGkok4IRhmj9XVUJRzyXFZD5UjROUSGp\nTuAYima0H24vSpgWxpabthRtRGehu1x8DVYYY0yys03mXQMrUKr+Jyhepb5AQyANGIvKIM2qXZsW\nzZvjcBRajK2RbedTi+JyQtdBXfl6+9dq4h9Akck+KD6Is/5nkUhR5HjBKG31OPhG+lIQWaB+60JJ\n444ULmgoHJOlvRMXwydRpaKVLmeIiIoU0V+qPgUFrA4PZ/q333L/NdfwIEUZNr9EvdzdAXJyWLdg\nATleolaaREe7USxbE6ZA5UPsF+HrmBizNrC3aJfKwLmytbrZczvpG/OANSiH9B8QKqG8Mu0VHn75\nYbVU9mZ+yoNffvmF4eOGVzl1vDwwzJjbU1IIy8zkE9SzzsrKIsbpJB24H8xM+wUooTEJJVs/TE+n\n/VtvmUR8xbVfWuTU5YjUtFTW7VunuBi+x6uZKKdOjhIKm4FvUOQ/twMbQfKlUHvIQwmNMxRhVy3L\nO1FSRnVVga05ULQgi7eQ0aRFi8x6CgaM6mwH33iDGgUF7EWNmwzA4e9PRl4esShNIhS4voTYdmsU\n0qVUpOVcV0DF1mj4BnXjdDru+jn1OdjuoEoVt6j+RhQTm7ngrJYXE97GnLHt+rZtKRg6lN4FBUwC\nt3yFkSgtYhkwE+WXSGvShFV79piRbsBlEQJdXriN1TV4T75cg7qxnmPuOHCf5fjTqDGYjiJLaw1N\n05qafobS3glbc7hEYOVBMpzGnhw029atY6u/Pxtzc0kJC6Np69YI8N+PPsIvPBw5eZJxqNXbUmBO\neDhdYmJA09BQGuj6EmLbrZz/VU0AlISU4yleV0Apx1O8Hl+sM/oAytGsJ8IdXHpQ/d8OZdfNQxHC\n56KI9yz+ieK4mC4leBtzxrZ9HTtSo3173tErx3k6pb9EmZj+AbwH9E5LY8aUKRx/7z1G79pFv6FD\nr2hOJQNuq/VWFBbJsAoBAe7EnUG1M0pF+x51068GdlAkrHX+nPlFHdoUtmPVCi4EV1lF44oPZfUM\nETWYLT1DRm+4/Xb+r6CAEGBYQQG3jh3L8chIav/xB1efPIkvheUa+wFX5eTw/OrVvLBmjfl58b//\nLTLxX+oUy0f3H/UaXnp0/1Gvx3sL8SOBQsEA6mXR1XETfqjIEH9ULoTFzlvV1PFzhbcwZes2/xMn\nuOXxx5HcXH5E0bg/WL8+4zSN/6Fk5Srgz+jmJhG+eustumdmwrJlxD377GURAl1euDGsVkctPNZC\njcQaBH4SqARDNmoxctryQ3+gHkoYtEdps20oEtb6Xtx7Rc9jwCPk2pOOvipSd1/xwqGk7GfPmg2S\nk8MRQMvJ4f2nnoLly9Fyc7kHeAx3n8NDWVnMevLJEmmSi6MBv5Re4Np1a6sVlwdtdp26dbwe761G\ng3++f6FgMNAWpYL9D/VCdkbZewNQqrwVl3iug7cFgnVbt+3bmTduHC1EuA1F4348Px9Xixac0jQy\ngb9TuDjpC1yVl0cS0E+E/np9c2+U8pfSWCsviixMghV/1+dzPieySaTSGAagxt
sGCgVEnuV/fxRb\npiedqEVbLmuOg5WOvipSd1/RwsHb5Lzx3Xfp6qVmQ/dt2/gSZe9NAlwpKbzjcnEEtXL7EjWu7nM4\nGBEeTnx4OMlff11oLvDQCAyCte7bthVhdb2UtIfoutFqFfUDSkj8ALSBpnWbFvsb60ux7N/L6NGu\nh5rw1+htrEGFEBZQVMXvheKitrx4DdY3ICMro0ixoEsB3sZgwsyZzB83zm1b0LFjzENFroGiZJGr\nryaiQwf2tWhRJCBiKErefgn01QWAdfFR3Li8nFHcav29uPc4fMvhoqakLRSam6zFCPxRUXZW5MHP\nv/5s5kpUda2gLLiiHdLewvqM4u1GCKERanp2714G7NpFH5eLpZrGNhFi9ePrA0+iSjk28/OjrU5J\nYI0pHxIVxScHD5ohht4I1qDqOp6LQ0XEa3sj1GMlqnjy3V5+sAoa05gmzZsQTjjJR5LZf9P+CxYv\nfiHhbQw+4uNDL6fTLPSThJKVfYClmsbyli1VLQ99nNzbpg1+ycnURPlGQcnao6iIJms4rJWp1S4l\nqlBskMTnqBt6LUpT6KRvz0MloAyhSKCEle6lsmE7pMsBz7C+/Xv24MzIICMsjPXR0UBhqOmREyfM\n4ir9RFiNemn/htIYBBgIvO7jg+/atUVI+O4/coSZU6fy9Jw5bgRrk/RwWCuH0qWE880EBcxtab+l\nUXBrQVENYTFendfBp4P5NkElvw0fN5z99faXqVhQVYS3okC71q+njtNJfFgYV7dqxa9bthCvZ+L3\nE2GNpZaHiOA4coSFQF+Hg0M+Pjjz86mHerl/QCXPpQM+MTFE1KzJifffp28Zs+0vRbLHc808LjZI\nIhJlYlqJCmc1tq8GbgMWUegH04tMHU69dH1fnriiNYeywtvq7jVUCH4vlGl8LhANBPn60vfTT81q\nXdZM1F4+PnyRm8tXS5a4FZ5PHj2a4++9d8kT6nmDN82iwfoGODVnoSqfhwplvRU3R7PPQh+cNZyF\npiU9k7pLwy58/dnXwOVHxmcda8bYuOGtt9xCqFcEBOC3cKFZV8Tz+IOvv85pp5NoCkNefwbq9OzJ\nWytWnFOG9KVG9ng+mmxqWiqdHu3kpn2SAP75/nS8viPkw3dbviOvbp5iZr0a2IsyLZ1AqWZ69cGL\nSaxXGmzN4SLAurrbv2cPBWfOcCo7m6ecyvDYD/i7jw8NbrsNEeEfM2a4cSGBekkfczr5c///b+/c\nw6Oqzv3/WUkIISQhyCUIiokRrQYRFDWKaIJyURDR0moLLZa2tKdVjlJrrS1HenhOf2or2Mux1h5R\nT0URAYECcguJgFy8EOWiiMbJAQMJFwm5M8lk/f5Ye8/sPXtPmCRDMgnr+zw8ZGb27FmzZ+31rvd9\nv+/3nUTSsWPMs8ST//788yytruYXnbBDl5vQ2aHaQ85cwq2oXMMY4zkv9OrRi6PZR9X212zukw2N\nJwOV4imxKR2yBaMb/Awly9z4y6uvstegUJuL+Rfx8Vy1ZYtqRxt0/N+ff54f+3zEYg8nvQ28fOCA\nrec50KT34Eazjva52ZSwXqhFOyM9g2HnD+PgloMqCyuAHPAmekmrTOPVP78a2ISU45DdMENKcdvj\nmPGnGS0adzTqLGnjEAbcWj4ybRrCuCkF8DMp6frggwBUTp/Oyldeoc/w4bz03ntcZnDTG4Fdq1bx\nX4mJDmaT2e+3s4mguXK+a3EXMasy/jZ2e+kXp5OXmKdivaZI2odwoO6AP+lc6ClUYkMW72LgBwOZ\n+/fo4YuHC7deIDedPKk0tizHrfX5ECNHuh7/w+pq/gX0RYWTjqMu60VA7+JiVr3yCr3DrJB2Y1FF\n+9xsaeXxKU6pDUoQzPf5NyEu6qzkAm9Bw6iW9RZprspAW0Ebh2ZCSslLTz1Fn4suYvX+/fSSkq+B\nysZGur30El1PnFA7rePHyZ41i+Fbt/ophqBiv2+cdx47rr0WKSVFhYVcXFlJMvBITU2H2aGFC9d4\n7mlcd/sJpxO4wXMD/VP6M2HaBKY/Pl0t/N1QO7pb1XtKvCWMfmA0g3sP5tANhwLcdAk0wrCLh7X7\nrqslcJO22P3FF3wOvG/kwCCwmAP+46WU7Nu1i+4xMRyLi6OyvJyel13Gkc8+4/rGRh4FZEMDs44f\nZ8727YCqmn7k97/nj48/ziNGJbX1M4K9ko4wN0PlD5ryJD3FHooPFKtQUSyqQC418D5PsUdtQvbi\nr+C3IR5ljfu45xzO5BW0xNtpC2jj0EysW7qU8/fv59T559ub8wBHPvuMHxw54t9p/c+TT3K+lCwG\nMlE7uz7AtqoqFuTns27pUm6cNs3m/neUHVq48FeCWoT2YqpjaNzU6MgljM0ey/KXlwfYS982ktR5\n2BUxjZuncn0lZGDXaQIqPBVt+A0jh5Yy1KSU/Nvdd3OBEHzcty9Devbk2fffZ8zXX/NQY6Pf6wiu\n3Tny3HM81dDA0RdecFRPu3klHWFuNrfyePPWzYx/ZLzq9BYUJsosVu+bPW+22oRsQu3uQsnNuxih\njqyzpI1DM2DupsZWVeE1CosAv3rmS0VF/oV+bE0NLx05QuVll3HvgQPcaVFUXVlVxfply/w7xW1g\n7xrXiUTQMtIzWPDYAtsN2HiskZiCGBq3NPoVLi/sdiHz58zHU+xh3PRxNPRrUN7AUNQxLrs1ES86\nTb6hNXj7zTcpXbGCt4BJlZXUGnRpWVbGTuB91AamKDmZi4cNI2nLFo7s2MEzlZV88/nnWVZd7fAK\nOqpAXzjsOROeYg/jfzaeqjurHGGi9Px0Nry2wS6H0RWlqBksu7Fa/Z3wdgKVwyr9tQ4QnlfQEm+n\nLaDZSs2AyQzZXVNDIdAYH0+s10sXFKf8p8Aky/FvJyayMkiOG5y1DB2NEdJcuIqMHYP0XapWoX9K\nfz+91cEaycefIAy+eXqt6cXprqdtu76OVOMQCUgpubN/fx4oLWUcKvH8NKo/eQyBXtIQkOY2e5nL\nmhp8KMbduSTbbWLqzKks/Giha67Bynbz94DYhCqCMnW+uqJ6PVSj5DW6A1kBAb6M9Iyw2HRnq7eD\nZiu1Eawx2LEYvaS7dGGZ1+tvxvJKly4U3nADJ48f5+v9++mZns7FgwY1GS7oiIyQ5sLVbe4DGZdm\n2OimU2dODRgGCCT7Nhn/gsJQJ3JOQD0k/SuJwVcMJrNvZshdYmeElJKf3HUXvUtL/R7rOFRfhxeA\nq4A1Rs/o8/r0QUpJd9NrqKnhFygKNnScnEIkUVJRonIMZ9i1C59QVrcRZRxijRduxi8UST6qWO5d\nKOpSxMNzHmb5y8vD8gqa4+20JbRxCBPBMdj1wI8svaSHAAfr69ldW8tFycn8s7GRWcnJPDJvnvsJ\nXc7bEWK6LUG4brNf4dXSvhGB4pNXwm1f3saeg3soqy9TN2mher3q5ioyYzKjhl/eVli7ZAn7//Uv\nHscunXEv8HtAjhxJTyEcXupV//gH6
1GGxMyZ/QEYvXs3/3bPPfxt2bJzwkAMSBmgFvSgMFFSXhJz\nFwVyFKc4pS5UN+z9HPLxF7+RiwqD3gFsgfX71uMp9oSdAzElZaIJOqwUJqxtHHd5PHRpbCS5spLY\n5GQuzMzkk8JCFldWMh74WXw8471e/i0ujkmvvx6y8cq50rIxHLfZU+xhyMQhVN1c5ezh8C+IORnD\nd+7+Dh/u/pD9R/arpsqxqJt7F2T3z2b7ku3t9A3bHlJK7hs0iJ5FRaQRMA7Hjb8bgIG/+AW//uMf\n7e+57jouS0xkr9FMSADHvF4urKvjxIAB1JSVMdOQf+nscOtKmFSexOrnVtvoqFNnTmXhhoWuneNs\nbUJNI2NUUJsFcSZbye8VtFENg24T2sZwyw+YdQ/jampYgQpFgpLXOHLJJSw6cCBk9WlnbtlopfCl\nxKYgfIIKKlxvkKkzp7KwfKG62axNVSDQhGUY8A5KdjSIWZK+Kx3Pto4juNdarF2yhBfvvZe0xkZ6\nozYWu4ALgY9RdQ0n09NZ8+WXfvmLdUuXOuaudYNyT/fuqhizE25QTATTSieMnMCv5/+acl85qbGp\nvPL0K446BU+xh8zbM5H3uaxDy1BNvBtRAlij8BuM9q7S1zmHNoRbfgCw8cEnomitoHpITy8qUqGj\nyZMdOjUdlRESDtwofG5JNk+xh4fnPMzqratVv95YVN2C1TjEEwgjTcSZk9gGPdN6MnXmVJshOsWp\nqKk2bQrN1S8y5+G1jY18jLpc/wcMQtUXTgIeAWalpbF2yRKOPPcc64YPd81tWcOanbkYE1zm5DF4\n48k3/IKP5d5ypj853TURnCASqPXWOjctPVBz0AtsRO0IbyYq2EathfYcmoFgHRuT+WF6DSaWAwtQ\nrRtXxsSw9vbb+duqVZ2elWRFOG0QPcUebpl+C4fKD6k+qmbj9q9RN10XVIzEVMU8Cdzt8mF5EFcW\nR8PkBlvCuqO0EW3uvFi7ZAl8//ssr61lIKqD5Zeo+WZtI/p2t24svuACFnz+OdMHDeLer75iXG2t\nX5n14/ff58g77zDfGtYEnoFO6T045mQBrq1Cg/WRps6cysLGhc5w5ybgCtTFNwowqQbGRceca63n\ncE73c2gOQjXm2b11K6vT05kpBHOAJ4D/ReVQ1wMTGxtJOH4cn8/HvJkzeeYc6chVdLTIftOBrSGK\np9jDqO+O4tDBQyoONxJ14w1B+bM5BDpvbQVOoXpMu3TYogQaRgepuo5CJbUtvPJohFsXuODXg5vy\n7Hn3XRZddBESFUIai5KLF0BOTAw/vOIK5txyC4vT07n7yy/5A3Dn558ja2sBowbn179m15//zOiP\nPnLU61i9h86EkooS+5yUhJyjU2dOJff+XCbdP4m1W9eq6miJMijrUZLdV6DahVqaUQmf4K6Su9rd\nMEQC2jiEiVAVo1eNHMnAMWPoOXIkX19xBXtRycBVKA8T47h/mzSJy44c6bQ3nhWeYg97CveEbB9q\nuvfFucXKO7CK8O1DEe+D5buTUN7EGpwtRmNwdpKLR93Mxt/tXW0aCmdqE+vWlOeRefMQPh/Powg0\n68Df+2FiYyM9kpP5j02bSE1JoYvPxxFUxG4xgUuSWFTENXV1vBYfz/QBA7gvPp5pKSm83qMHfx8w\ngO3Dh7PbkOjoLHC07zSqmm3wwgeFH7AweSEFPQtYsXsFJ5JOqNe6oOoaGlF1DZ/j0FmS4yVJKUkd\n3jCAzjmEDWt+QErJLo+HYenpJG/dyi/nz/cn9k4C/46ad7cA915wAd+4+GI+XbuWzRhue01Np1Rg\nhUBct7pHtWsD9379+9mrRoO1akLs5jiJWgmvQyX8fEAZ6obtS2hJA+NvN1mD4GR5W+cozqRfFKoG\nZt3Spdzj8SCAS1H1DMGblqcefZSxu3ezDhVmmoVK10y//HIQgkmffspnwNSGBj667z4lGb9gQacO\ndzpopVkQtzbO1mRKrBIBL3QTAa/WGko6huq8lIfrXI3WjUhzoT0HA6F66prPPzJvHr975x3mFBRw\nw4MPckl5OSP+/d/9/PF1S5cy+qOPSCDQy/cuoO7YMbpefTWPNTQ43PZ1S5d2uj6+/oW/OyHbh9rc\n++6EtZujARiN8hByUFWtk1HexDdQjVfyUG7/MdRNPBTX/r2mAVuYvJCCjAJW9lvJis9WUNCzgIXJ\nCxn9wOg2aTXalH5R8OvB/cwnNDQAanf3MjAFdTlmXnEF24cP56O8PN5ITydHCATKRi8EDp4+TUpS\nEvFS+vuhv/f88+dEuNPRvjNmCnl/yvM/Ts9PR3aVAS+0HKe0/CiIkTHKFUvGda529ES0CW0cDITq\nqRv8fKgY8e6tW5kD/Bz7Lm7G6dOsfuEF7jCeGwv8T3Iy2665hhUvvdTp+vj6F/6hqASeGY+90RAy\nmzXX7t5fh1rUzcdZKK2a4NBRD9w9inpU3HcyymDcCInvJXLbgNvIPenev9dN76Y9chR73n2XbcOH\nM+eWW/z/zHBOqByX2Y8BVFX+Iyj7OhWDFJaRwe/eeYdFu3bRIyXF3z/6ThQRbJDHQ58PPvD3Q18H\nTA9iKXVWWL1Fk0594QUXAiCR1NTXBCqmIaQCa88+Pcn8OFNRq/OxzdXgjUhHhg4rEVrCwu35UBXN\nV44YwfI//Ym3gddRHQaPAFIIetTW2gzG9+rq6DJzJus7oWyGf+FPRVWPGiGg9IaAkJnNvU8FhkPs\n67H4+vnghP19HEbF577ENXTU/VR3qidX2xb6mtE1JJYmsuLFFa5jDKWC2dY5iqZkVUI15Vnxyiv0\nHT6cRceP0+2zz7jr/PP5UUkJ66XkeWD6gQNq3hqerHVGJQJ/k5KxUvo3MbegqPpX0LEk400K9PZ9\n2yEesr+RzbOzn3XQpGfPm80Xh7/gq4Nfccx7DG+KF64BEmHLT7Yg66VSXO2FymutR9G+ziMQSgqa\ncyOyRvDs7GfVJqN/EaX5pfQb2K/TybdoKivuFFW3FozWxuzWiuZntm3jO5ddxpQvviCfQIx3DPBi\nTAznf+Mb9DK0bT7ftw9x8iTeoUP50f79js/s6AhXRCy4anTCyAn8+JkfUzW8yl+tShmqd+9AlIu/\nA5u+UubHmfRJ6sOOy3c4xpGwIoFPln3ieqOGotn6q12joN2jWZHvJtj4yLx5zLrhBp7ZuZNvdu/O\njOpqYlBe6XIhSFi8mD3vvsunb76JzxB9PF5fz49rargLRbVOQMlnmPRVk/7aEeainwJt7ShoNHkq\n+HsBGekZ9urnYAqqKXuRCGxBeZxuHd6Wo2IrE90/wzqeaOviBrpCutUIJWHxzLZt/OLGG23P33fJ\nJfzg8GFbTYO1b++nPh//Dv4b8K9A8oAB3Pitb/HL+fNZs3gxv7v3XrYBk7t3Z5mhzdTZZDOakgtw\nu5EAh4xB4vFEevTuwZGRR/w35oXbL2RYxjAqfZX0T+nPjG/PYNqj0xTrKXih3wJThrov8K49gz
tY\nXYSpqtqAisqZi7sEpg8axILPPvPPJdc5bnnPctRmud6Q9E62aDFFI/xqqtYeH2Az6v4NwDZcaxnY\nhLKQX6O8hDrs+QXjuFu/uJV+af1CSl+cLUXVSEBXSLcCZpOUSS7u+1OPPupw63sXF7P60kvZ0SfA\nm2yUkg9ffZWf+3wcw56MXgDMmD+f27/1LRobG/nND37ACFSs2CraFxyi6ugIJSIWqvHJ4N6DAzdX\njjq2xlvD6JLRJFUmBW7MBXYjM/qB0RRfXezKiuJ6KDpU5K+aDt7RyXqpdo0xwGnodrIbVx25KqpC\nA26V01aG0x9RtvRu7HmuiV984a/KB/fEd47x70pUG9Es4EqfDzFzZtTPwZKKkpA9PsxwoD906MZ+\nq8FuDMy8lktlfmO3xpBzefa82WzYvoGjY45GXRe3SCCixkEI0RO1Jo5GcUYel1K+HuLYp4Afon6+\nBVLKX0VyLOFg3dKlHFqzhtWDBtkWfCkl+/PyqAuStugd1IcB1C5uxLRpLAO+S1AyGvjHk09y+7e+\nxf975BGuq6lhAvAXlMzB68nJZA4b5s9vdAbZjKYQqvGJv6ObFfFQQQXzZ833exqz5832L/C2c5k5\nitOoxhp9gXdht9zNjn47lLaED14f8zqjhoyie1J3FWe2LAS13loyK6NL2dVPhrB0abMu9L9EbTRe\nRyk39EKlbI5LyXkvveQ3DiYN+42iInyVlQBUVFbSTUqKUHb1l4DsIDmHASkDVK1BE0q//tyXcDnu\nQ5R6qpWQYKip2no7hGAe2TY53WnSSHVkRNpzeA5lk/ugiIyrhRAfSSk/tR4khPgJapN9pfHURiFE\nkZTyhQiPJyTMHdiq+npmJSfzRH5+i24I88bb++GHxFVX8xHqBhWoXOvpkhJ8Ph+b//IX1qLc+Rko\nIsQNPh8xHWCnFimESgSH6uiWQkrIFou2c6Wi2FE7ge8Y58mDmiE1tnhzo7eRjWs2Eu+LV2JpQbvE\naLqhQ+l4vfTUU1xm2bTsKyqi11df0YCqzp8FvAZM//xz1dRHCEeI6O0332TxvfeyDFVZ/QiBBkEd\nwYOdO2sum6dv5tAmZ85h7t/n+o/Z8cAOirKKnJ7lCdyZbycIzMMQ0toAMx6ZQVFRERxCVe0fw16E\n2UnorBGjsgohElGitr+VUtZKKd8FVgLfczn8+8AzUsojUsojKDmX+yM1lnBwpsrUcPHL+fN5Ij+f\nvqgcwxzgzyhG3O+ABysr+emkSfzMUufQFeU9rL7ook5XhdoUHBWqAF64/hvXK2pgECVQxkqnp5Fe\nxKjvjuKTfZ/Yz/UR9mrVGFTMJaiClTvA292rdo9B44imG9ptfq5dsoReu3dz44MPMqeggCfy84lp\naOBvwPHkZH5w+eWMiolBAHd7PCHn9MqXX+Zuo/7hLiA3Pp4vgYnnndchKqMz0jN4Z8E73HXZXaSt\nTyNtfRoTSyc6EsWDew+m786+dPm6i6qDWY/yMFMJXUuzCdLWp7lSoEH1nN74xUa10t2D2oxsQRkI\n4zydhc4ayTqHS4EGKWWR5bmPUeHMYGQZr53puLOCUBxya+I7VFGcG5765S/5UXU1oHZgALmGxs22\na67h4Pr1/lzEWBS3/EdAw8UXR3XiL9KYO2uuqxF4dvaz9uIk48as8FXYd3jlwC4ozi3m6E1HVVLR\nPFc1gYK7AlSIyYf7DrEa4r+Oj1p+utv8fPvpp1n8+OP8t9fLG48/7qerTq+oQADTfT5q6+uZYPQq\nv7OhgbV/+AONjY22eSylJOHECX9P87uA87p04TngkkGDmFNQ0CHmZEZ6BstfXk7p+6WUvlvKihdX\nOPJRKwas4OiEo9R/s16FoY6jEtBlwL+w19IYeSpxXNC/b+hNwrRHpznlXSYAq5s2Kh0RkQwrJaHk\n0aw4haojPNOxp4zn2gRNVaZaY7vB8V43SCnZunAhtajNSQJwR0IC1153HZdffTVXjhjBtd/5ju2z\ncoF/okLj5xLO1A4xON7v6CBn9Q7iUeyiLahaiG6oTJcZOsgznneT1UiF2NJY+hb0RXgF2VnZzP/r\n/Ki5od3mZ9/CQnIN73NSURFrlyxh/TPP+KU3xtXUsLgosC+zEiuOvvCCfx67nbuzSXU7clulqNyA\nuah7gbdQyZoLUG7+5cCnIL8tKYwvpNBb6A9hWufFSd9J1w2H6C7YvnB71MyhSCCSxqEKSAl6LgUV\nlTvTsSnGc22CM/VRaE5f53VLlzKzspIxBOiB00+f5sYHH2Tc5MmKr37jjXwAfPbee1xWVwfAtcCR\n48f9ceFzBc1ph+jQwgn2BFJRCcT/xdnn4Vbourgr9avqaZzQ6GAy1e6spXZMLXhh78d7I/Tt28QG\ncwAAIABJREFUIoPg+SmlpPC99/h1fT2gxPXunDmTBwyvAdQiP0kIfnj55Qw0yBWNjY18tnAhiyzz\nOFgjrKiwkIsrK0mmYxXBucFkEK3auQq/JAHAdhSlyzo/7gZeJZCALsARgnRjHSXJJE55TzkT4Yn9\nO5VhgMgahwNAnBAi0xJaugoV+Q3GPuO1D4zHQ0McB8CcOXP8f+fk5JCTk9OqgZ7JbW5OX2fzZlt0\n/Dj37N+PaGxkkhCsePllxk2e7P+st998k2vuu88fXgJY20l2amcLGekZLHhsAdMenUa5r5z68nqq\nvdVOTyAJ193cFYOvYGCPgazYskIFUAWB4qeugeOijXrolkC+5r77bIagf2kpf+zVi+3Dh9torpcH\n9YseMW2abR4HM+1unDaNsZbP6qjeg612JRG7x9gF9/BikuW4EIKPVpLCojcXcfjIYRWSuhP/hiPu\n7The+/NrEf9OzUVBQQEFBQURO19Ei+CEEK+hLvOPUcojq4AbQ7CVZqICAaBSRX+SUv7D5Zxt2uyn\nuX2dzdyEo2lK0Ht+Pn48MW+/TS8p+VoIuPxyevbu7aDGagTgr4RtOOSvR4ipjKHxLosnsAn1Wg4O\no5Gen86m1zY5ipRsjeEByiFtaxqXZ10eVRWuJqxzx8Q+oFdsLHe/8YbrQh7OPG6qCrujzcm7fngX\nK/utVL+xWe18NepCHQTuw7mpeM14brRxXBONfzZv3UzOAznI8VLVQ3yIYjedhtuG3saGJRvO9lds\nNqKtCO7nqDqHo6j0z0+llJ8KIW4C1kgpUwCklH8XQmQAe1Dz9h9uhqE9EE4+Ivj4XX/+M9+XMuR7\npJTEnzjBPOMYKSWzkpOZU1DQId33tsLDcx52SCQ0bmyENwioud4IfIIqYroDmwHo17+fI8/hOeBR\nhXMWw8AOKBtTRll8mY0uGy0GIv3SS6mqqgKLhyALC/lbE2HPcOZxRzMATWHH/h1KZgXUb3s5ahPQ\nC+iHSgiaqr5mzqEPKkm4FuVJBs2hpLwkZvxxBqAS0XK8DOS7bg2cx5fka5Pv2NY45+UzgmHupgB2\neTxcnaEWCLfdlLk7O3/nTj5OTuYSo6DNfM3Uwfm3u
+9m0oYNDtmNaNewaW+kXZtmrz4FdUMuQwkD\nrQDSUMbD3M2dNI65HtK/SCf90nSbN+CQO8ijSRmGaEQoLTArzqTN1Jye1R0BjrlidtoKroKuQFnK\n87ATGNZAny59qKyspK5PnUpSZ8HAz5WW0rB7h3HqjmC+DbAMptwSnXNFayudJYTT1zecm3TtkiX8\n5bvf5eJBg+gVVIXdEd33tkTv7N6cuP2E84VlKI75agI3uAkvsA7ipL2JS1JeElkXZHFJ/0uY8e0Z\nvLD4BQ5XHGbf5/vUohKEtPVpXD4o+sJMzQ17uqEz9DIP1ugqLSsl72BewBgsQs2R4LmxCOVhuISQ\nEt5IoK6/YRiG4q+HmFg6kd2f7nbV8IpbFMeBvANRMz+s0MbhLMB6A4a68YJv0kZg3Pnns/arr4iJ\niQn7PBru8BR7GHzHYGrurnHexEsSqEurUwToe5zv7bq0K6fvPO2uunqjXRgtpEKrKaUQRUJqYN+Q\n+J9rhhfaGeakm9jdhdsvpKGhgSMcUaHCBuBelzcvQoWTbnV5LY9AuCgP5WF0hYTSBF6a+xLfe/J7\n9q5xqwWv/eY17vvWfWfle7YWrTUOutmPC5qqnpZS8tSvfsVP776bsbt3+2O664HLjhzh6V/9Kqzz\naDSN2fNmUzOqxt4IyAvd1nWjZ1pPFQo6D1WZWkCg+O0YJMUlubNTDEaKtZmPW2Eem1Ca/ziPb280\n1SAoHHSGOemm0XXohkNcd/F1TBk6hfjKeGUc3Kqgk1FFcG6vxQbOx60ollMu1H2zjt++8lv++dg/\nSc9PJ3VNKun56RT8tSBqDUMkcE6rsrrhTH19zQR0z/p63rj0Unb07YuUkk8KC1lcWcl9r76KfFrV\nSQef5+Gnn+aj99/n0U4U6z1bKKkoUWJ8N6B2/BIQkOhN5MhYQ8b7YmAr9uKm1VB5orLpntIWimJw\nwnrfvn2q+jrV8t4o0l1qTRjyTHO7oyCURpcp1Lhk2xJVIGl6AubcWI/qD3ICBx3Vz2CznM+/dTY2\nCKu2rMKz7ey3j40WaOMQhKZYHmPuuYe1f/gD59fVMR94OCWFJ/LzWbd0KTcafPIfVFT4e/0Gnyft\no4/4as+eM1ZdawR1lMsxnvRCbEFsYNH/EqeUwXjwvukNiK2Ziepy1K6xHEiE5NhkVznvqTOnsjBx\noX0wUaa71FI0l4kXrXBUzoP/N5o9bzanxxohRXNj4UPRWesJMJMSgA2Q1iWNbrKbncFmnM/WRi8e\nio5alYE6P3TOIQhNsTyuHDGCwilTuNrrZSywqmtX4l591bU73PnZ2VQXFtrYS/sKC3mzsrLDxnrb\nEqGaqGSlZQX47KYBCMYyYDiwGVUQNQFbXcT5Xc4nLi4uINttaSR0tPooez/ZS9XNVX7aYzTlHMC9\nz0M46Cx1DZu3bmb8z8ZTlVrlZxVlFqvfaPqc6RRkFDjftBqlk2/p6mbmDK6/9npnLYzZ/Mk0GF5I\n+lcSu1fujpp5cCbohHQbQUrJw9nZ8N57zMfSdeuSS7i3pIRxtbX+Y90ShOEwmzoLItU20a2jHBC4\nkY0EsyN8tAHlEzegdohmdfRQIBH6relH6R2lgfe5tCBNykti8AWDyeyfGVVsJegcbKOWwm3TkJSX\nxOo/rubmm24OTTB4C7uEhvG8ueAD/rmWHJvMhl0bqL2j1h52uhqmxEQnbdUN2ji0EdYuWWLzGkws\nj4lhxWWXcVHfgIxe8G4sEvTDjoK2aJtoGo2iw0XsPribmtE1gZt4I8oYXAO8i6MwjutBbBDI0TKw\nKyygyerYaIJ1Lj183XWcn5t7TuWwQi3+5m/lKfYw5L4hVN1aZf/dAduNa2IZ3PqNW9n4xkbb09mT\ns9l5Yqc/12VSW3M9uWx6eVPU9o22ItoqpDst9rz7Lu/36cP+qirM1nZSSmKTk7li7Ngm3fLOEusN\nB6G6vUVSu8gU7/MUe3h4zsNsX78dES+4cuCVbK7djDfVq+QQgrt95QJbjBaha1EG4UuUjPMZdHWi\nBda51BFyWM1dRM90vC0ZXY5S6pWwsXojnmIPGekZ/OMX/+B7v/0eDWkNKux0NcqbdCMp9IC8/Xls\n3rqZm2+62f/SJf0vYedlO13zGqHa3UZT6DES0J5DG6CzxHrDQe79ua4xX3PHFS7OtEiE8lAyEjLY\nuG9jaC77EuA2VHLSZDqFCE+F8hzaa9do9RoAHgZFjLj+euZHoRfaXC+yqeMBe8/mGpR+kqmkahy7\n4LEFTH9yOkXpRWqD8DWK8nwx8CmuvcZJVDpcViaS21gS1icwNmssMlYG8l4motDT1GEljajCmdz+\ncBDOohLqc3q93YuqyipON552F1srAMZgDyWZQm1BC43bIhZqbAseW8ALi184qwbDlrcCf2fBVV27\n0mXhwqjzHpozFzzFHkZ9d5RrFfLE0onsK9unrnkNKj/UBVfZk/T8dPs5CrD/zmtRkiuWUBFA6ppU\nTu486RjTQ3MfYn3heuqS6lSoMhES3k6g7q46x/dt7gbobEMXwWlEFUJ1e2tOl7WmQlMmSipKXENB\nJ1JOcPrbp1Xzn9U4u31dZzy2SjSnonaQ20C8Ibjty9tC7m5DjW38z8azMHkhBRkFLExeyOgHRuMp\njiwn3iyAe+Lmm3kxOZkxxvPjT592dDKMBoT6jYLDdabBLY4rdj1+5/6dgWueimIRncL12HJfuZ1o\n4EXVNOQZz/UBRqDo0RYmUmqslceqkJGeQXL3ZOpur1NeaKr6jLqkOtcius5Ad7ZCGweNiMIsKgtu\n+dmcXXQ4i0qoftQI1O7yPOPx66i+rG+hYs+gdpPBVbKpwI0gkyR5H+TxzYe/ydSZUx0LfKixVaVW\nNWnMIoFfzp/P7955hxsefJAf+XyuOaxoQqjfyHPAY7uufoMbi+vxDbUN9mueSuD3DTq2vrxeVc2b\n3mAO8E2Ul1GAaim2EdumIXZNLK88/Yrrd3D9va+Bbuu7RW2b2UhBJ6Q1Io7mdHtzQ1NFTiYcXeJM\nz+ByHCEiVqPqHrajZvxtKAOyCbtq51qgS9OtIkONzS+9YOIsJrTP1MkwWjB31ly2/GSLasBj+Y2K\nry5m9AOj/dfVn2QeSqB2xVJvUF1e7bzmWYrCGsxKqh5dTdz2OBriGwK/Lcb/d6DyS0NR2lmngB6Q\nc2mOLRlthevvnQhjssaQVJnk2u62s0DnHDSiDuEmMs2Y8Lr313G6/rTaUYZo/BO3KA7pk/im+Oxh\nhw9Ri8R5QDWuKq/WGHkonn3VcKNozvK+iaUTSe6eHNV0x7MJf7OmI4fUtQlSOzWvqy03YTKQfKiO\nMGNRRjsBGy05YX0CV/e+msKiQmq71SqCwZWoTjI+4BAwCXvVM9gLJ42/m8oVtAU1+2xBU1k1Oh2C\n9Y6a2pntK9sXUGA148vWRcbgqQ9KH8RnRz9zhiduRcWjL0YlOl1CRkVHi2xSG2by2RzbjD/OUAyZ\nHoEFZOAH
AymsL1RV2BGkO0Ybv76p8cyeN1t9/3dxVrJbPCubF2iE91gD9EAt+n1QyeBNqH4MvaBu\nSB3b9m6Db2H3EG8yjj9mnKO38bpRAOmPxXmNc58hV9CcudjZoD0HjQ4Bt0Vo9rzZTjZMHjAE2IUt\nPCHeEMj+0pXh4m8XmYbr60n/SqLqzirXnaO/IO9oEaUHS+nXvx+Z/TOpqqhixYAVEaU7Rtsu9kzj\n8dOaCzgjVdhT7GHGr2ewcc9G5cUZzCBWG38PxH6e9bh6iP6wUXBocRNwGrjFOO8mwAcDe6pmPp1x\nsddsJY1OD3MRCmYDfXH4C9dkIWsJLAyo/2V/qV7Lx85gWmP8nYbr67GrY5XOkkuyefPWzQyZOISF\nHy1kR9kOiq8u5ljdMebOmsspTkW8sC4cFldb4kzj8cfrzVxCEwncjPQM0vqkqQU/FihELfQ3oQoV\nwc4wq8b1+iJRHmPQ788olNexFmVwYoBKoyBSwxXaOGhEPUItQmWHy5yMlUQUjTV44Yg1XjMoq7yN\nEuirQal01ge9ng9sgR6NPey5BPPzDxcx/pHxyqO4FbWj3QVF6UU8NPehkEwdtxCGp9jD1JlTyb0/\n15UhZSJcamhb4Uzj8dOazeu6BRJWJDCxdKLN2zG//1sFb6kQ1I2oxT0LVdl8BNWkpwQVLgLlBYRi\nq1mNiIka4/k+QE9gGJAAh644FDW9OqINOuegEfUIpd/fb2A/Yj+OdTKWknBlt7AaVRE9FJVfMNtI\nmppMK1AhjRigEbpVdmPk1SNZ4V2hFhczh9EIX9V+RdUYu0dBLrAN1p9az7o/r2PHkzscIZe5f7XT\nHZsjxdCDHipsFoNaHGOAWPA0ePzSEWcLbmG9UMwtzwEPuffnOvIzKRenIGMlFb4KZs+b7RRS7IW9\nYG0XMBl7aMjs0ZCKk21m5hz2YR+XKax4H/Z5MkIde/ii9jGu0Q5tHDSiHqEWocy+mbz25GuOmH9a\nUhqFHxTaKZS7UCqtZilAH9SCH2/8uxYlp2HmHLzAOrhv/H3s+tsuDtUesi1Ex9Yccw9r+FSR1AuL\nXwgrkenwimqgqKKI7CnZjL5mtD/B6yn2UHikUI3PrBLOUZ9Z7LVTQyONUAZswWMLHAYwbm0cxTcU\nU9ynWB335A6//EVwfmLHAzvISssKPGfd8ZuhoRoCzZ66oHpy5KGS1ZUoSqphzLvWd2XkiZFU9api\nb97eAM31Q5y0VsOQ4+t8xWuRgjYOGlEPt5oGcxceqqbCmije+8leqoZWwR7gLuy7x+tRu9B9OBoH\n1Y6t5cdzf0x2VjaHLj5ke+108mn3eocy4A44fDKwG5XY49rWXfgn+z5Ru11zt2wkUo/GH2Whd6Hf\ni5g9b3bA2G0jsNgZrKwiXxFXjr2SwVcO5pL+l0SExWSOc8OHGzgae9RmTIuuKnIYQM8BD8VZxepa\nGiyxoqxADsItNFhZUKmSzajj/ddU4qqfZOaIErwJDMsYxsfFH1NzXg3Ewunc0xz4/ADDzh+GvEBS\nll9Gv4H9+NL3JUfjj9q/nGHIk8qTOl3xWqSg2UoaHQJuvR3CXfya0u1hG2oHboYrgpEHfX19ldib\nFeWosIa1kdByoDuQAGlVaVTICltPAJswnFtjmY8IyeopqSgJCBqaXH0XTSiz74DZ/CbUNWqJsKGt\nleZHkFqTyvjrx/vfmz05m52HdzrGk90/G+Jgx/k7bPRihkKvd3px4vYTduN4tXFNuqA8hGuwSV2Y\nvRkcOkqgchJ5KIJBI1wYp5o4uQnlxS2JI+/FvJAFcB0dWnhPQyMMhFKL9ceeFwPfxmk8CiDNm0bZ\nmDLna+YCdgKV1I7H3rPYpZvYgA0DKBld4jyXGR4JrgdAFWn1T+kfoO0WoIzINtybHRnPN6Uq21Jh\nQwpQi7uLSGEoA5yen0796XpK4kvsOYI1EH88nn5X9At4RUUoA+LW39m8juuBMUosr/yO8sBnuRnL\nt6Gf7Ee5KLeJ51kbBHVWaCqrxjmBcBk9oRBSi+kQasFLQS3mVprrJqAerrzoSpL+laR2pAWo3Wk+\nauG/FVWJe5yAYYAAffIjy+fFw+Gaw/bFsxzYBl3Ku5BwOCHAxrGM0fSUMj/OVK97gZVAKSr0YoUl\ndh+KxdQUBdW8zqt2rnLPqVThoIkWXVXEqO+O4kT9CWWYyu3v6TewH3WNda5yFt4ML7JeMmLvCGWg\n3ydgGMzjci3X0fL7pMamqsflqN8lHxUoN6+JwUQrvaNUqaiOhG5bu3FXyV3sXrS7UxuGSEAbB42o\nR6g6h+YYCDe1WPKB21GLaQJqsTdprNvU4/iaeA5UHrBTVt9FaTiZO9l4FD3SbTG1Or1ekD4ZGIO5\n070R6r9ZT9236ojbHhcwEJZ6gIz0DBY8toCkD5JUGGwyim21A/tibNI5m6j8DUVBLTpa5L/OpxJP\nuRvTKpfvWQPFJ4up7F6ppCs2W8ZkEAdiE2Pdr08MHLrhEOkXpfPO39+ha5euoa+jabBrVc/vV55+\nhQu3X6iuwY2oNqAjjWtqVsgHGaTaMbUkpSR1yqK3SEMbB42oRySKv0wZhPT8dOUBbEOFKvqgdqYn\nUXz8HONxjnoc640NhDyMz+YOAoVZoBatOtwX08bA3zGrY1T1tikl7lKs1TCugW7ru5H9abZD0faF\nxS8EGDjmWEah2Djm5+UDWU2rhIbyokoPlgaus0vhGpuAvkHf06SJ3mN8F5PttQooUnH9ldtWUnms\nMnRdguHl3HzTzUzOmex+XBl+g83tMCxjGDffdDPDMoa5M5HM3IaLoYnGDn/RCG0cNKIekSr+ykjP\nIP3SdOUB5GDb+SfHJ5OUl2RfDPOgtmut+072tOW4tQQkGazvX4cqrsuDmDdj6NPYBw6i2EnbCNme\ntLZfLYdPHnZvkelyfJovjexPs0nPTye7fzZTYpqWSQ/VcyOtf1rg/JYeF12WdqHnqp5qYc7GbjTc\naKKjjNc/hIbzGqgcWUntmFplMII9t6HYvJyQHt44Ar9ZPFT6KgGo8FWE9jQacTU0ZqvP1oQpzwVo\nKqtG1CMcCe/WnmvizROZO2uuSqrGFavwSD0qXORyfNyJOHqs7MGJ7idUSCoHOye/EZUTkEBvaOzT\nSFlmGbyH8lZyUHFyNzqsDw7WHnTUOoQa+23X3HZGvaZgdlKweODcv87lobkPBc5vUUetr6/nZO1J\nZQDjCVSR+yDmWAyN8Y32D4tH5XBuQ73HTCiPRPXXEKhCxSR1ja3FgVahu40fbqTs67JA0ZvlO5u/\nfUgJ9TIgGRLXJ1IzpsaWPJ8wbQJDJg5RPThigazO2QO6tdBsJY2oh41dUwN8CAlVCYwZNoZnZz/b\nrBu62WJxblx7gy46sdZoX1lRZKfBmgvr1ygNoFtQkh4bUL66qTKKy7lXoxZPo3rX5OKvfm41F15w\nYYuE98K9fpPun8SKz1aosQUJF7IRlW+YFHguKS+J7IxsNl680Z19F
Ysygib99Ab12f56Em/TrKHc\n+3Mp6FnguEbd1ndj35v7yEjPYPPWzUrGJKivg8luyt6TTWb/TL8RnDByAt/7r+/RML7B8XtOiYmu\nHtCthZbs1uj0MHeTD895mPX71lM7ppa6+DpWeley74F9zd7xZaVlUVlQifAKsrOymf/X+c5mPma8\n2rpTlqgd6Th1np07d5KRmcHhA4ep9dY6CtlsC34sTimIbOASlG7QAOOYeFQrU8viXOWt4vZZt7N3\n8d4WyUf7czYWQ+d2/U5xSo1pOfBd7KGi24AllusgYPAFg5n989ls/PlGe72HuTgXWt7fG1WBPtl+\n3qpbq3hh8QuuxmFAygC73pXhkY3JGuOvGp/+5HTVS8PwZChDhe2MnhGZ/TNtyq9DJg6h4c4G+3cz\nqqW1jIYd2jhodAhkpGeQlJKkYtcuielwdny2HfRAwAt7P95rO8Zfjd1YFAhXpBLYARcYB+6AsjFl\nlMWXqYVvFWqBdFMEHU9A5M80OKOM5+qwL8T5KI8h6Bw1o2t4aO5DrHhxRbN3t0VHDdmLbc7zWq+f\nfzHugd0TML+HGT4DOAaHth9i7MNjVY2HWachUIt5cO+EWFRhmnleS7+NjdUbXbWhbJXxOfg9pflz\n5gNBRAVzXGadRz+nltXsebPt7Vyt303LaDigE9IaHQatTUyHw3oyvZQB1QOUVEMwW6cRZxK2D4F4\neogkM30IUCzN5+JRi6a1VkGgdsAu59i5f2dY3zMYpQdL7d5Q0HkPVxzGU+yhsrqS+NXxqjPeapQh\ntFBSSTT+PgZx2+Mo6V5C3Zg6FQJrQP2fQyA5bySb/Ylns0e0hcJLrjKyJjXZmiiePW82Cx5bELIf\neaj5kFqT6tq7vKSiJGSfai2j4YT2HDQ6DFqbmA6l7hpsXDLSMxh05SBKepbYwihkQ+q2VOJ98U6t\nnj4oD+Ik7gnSWJQB2UbACzkP5W1YK6mHorrZuZ0jeFELA55iDxVVFcrQJbufN6Y2JuBR3Y2NEmvK\nWcS+G0tuZi5Ve6rYs2cP1WnVapF3C70dQeUnuhjf+3KUl3AaFULrhb0dq2GkH5r7kMrhWAX+ngyd\nKA41H8ZfP97VuxqQMkB9p6A+1XGr41j93GqdjA6C9hw0OgxCUTDD3fGlxKa47hqTY5Mdx/pDLDnY\n6h7GXz+e0deMdufiH0ctiMEeh7lzNimWwc9ZaxXqUVu2YFrsRsjOyg753YKpmYveXMQF11xA5qRM\nvr7zazX+/yNgeMzzboLPvvzM4VGRSyC8tQl8I3zs+XIP733xHtWTqwMMIvNcZuhtBCpUdiXEVsSq\nxfhTlJcwFiWbXYdrZffO/TubVc/S3Pkwd9ZcMoszlXbTNiBPdfnL+2vn1VdqDSLGVhJC9AQWoPYE\nx4DHpZSvhzj2CeA3qGlitucYIqUsdjlWs5U0/GiNAJ+fjWPV99kEd112F8tfXu74nFDMIHDKT7MJ\nteAlEuh33Bu1cx5KYCFdhMp3mM8ZiFkUQ2NcowrP3IOiwW5HGRsvcAqyb84ms29meCJ5q1C5AHOH\nbkpMWFhQZvK2y/tdqP9mvfOCmTts8/9lBBLPWM55h+V6Gswf9kF2r2xKD5e6Cx5uwc7w8kLf9X05\nOiHIIwOyP1Xf200ksLnzoTXzp6MhaoT3hBCmIZiOmh6rgRuklJ+6HPsEkCml/H4Y59XGQSMi8FMj\ng5RBc0/msunlTY7jm1pIrK8lxyZTXVXNpg82Ie+ToVlL+cbnjsLRPIjDwL0oaY5hLu9dg997MdVd\nX1j8AiUVJRQfKHZfgJehduoQoOa6LdLHCDQ+sr62DbvAn/V8Jt5G0XTN63kxqnr8KHT1daVbUjfK\n7yzHgaXYxPW6re/GmKwxrn23m+rhrREaUUFlFUIkoqbXFVLKWuBdIcRK4HvA45H4DA2N1sIWKjLh\nhf4+95xFqF4Rwa+ZO3eZJF0rjCk1HmejwkZvoRhBwV3MalAL7Ic4GU934M9XFF1VxPifGS1Ke6EW\nY7ckOKiFvw8hk9EcRXkYK4GJlvHkoYzWclRB21oC3o/VsJ0ChhufEWQQT3tPc/qNEH0vknHQU+fP\nmc/eB/baPKCkvKSQPbw7U01CNCJSOYdLgQYpZZHluY9REcdQuFMIcVwIsUcI8dMIjUNDIyTCiVG3\nRFbBz4Iy25OaSEXtuFMhrjaO2Ldj1aKbgFNyYjzKKAxFSYC7LeRf4xcFrEqwLJiGRAf5BBhGXtSC\nvQm1sJcbx1g38ceM805A9aHYhpLDXoLqsjYaJWM+CrWNvMI4lyl0l4vaEm41zuVG4x2HIwcjVgnl\nHeUAIyAzRdFTTaaYyU6aWDqR+Jp42Gv5XsZ5tT7S2Uek2EpJqD2EFadQ+wM3vAH8HRX1zAaWCiFO\nSinfiNB4NDQcsEozuBWRNaefsxV+FtR1BJoGWXfgN0BDYkMgPLMS98XfFP9rwH233QO1+JphJnOx\n9OHIo+BFGZp6l9eyjc/ZQKAorRuBsFMB9hBUPKoIbguBWge3Oo54nN+rD8ozsbC+rhp4FVkxWRz2\nOH8D0yMzf4uv7/zaWVyXqGsS2gJhGQchRD5KBMAt+P8uMBM1da1IQe0/HJBS7rc83C6E+BNqmroa\nhzlz5vj/zsnJIScnJ5xha2g40FSoqKk6iKZCGH5KZSpKImIZquBLGI/NxLMZ2gmh10Ql8Kpx3Cbc\nF3VjXP4wkxcHLZRRqAX+S9xF8cyK7N6W14YSMGwhQlC9vL2oqKmgPr7e8Zr/+7p9r27YitSyKrPO\nGBJy+y3IBbYoT8Na3KahUFBQQEFBQcTOF5ZxkFLmNvW6kXOIFUJkWkJLV6F4EWF9BIF6SgesxkFD\n42wh3DqIYNgqeVNRu+UROBdJc4ZfoxKw/mpvc1c8ARWiKQcqCFQdH0EZA6v4XDzKY6iBDxlYAAAM\n1UlEQVTGdSEHQi7yXbp3of7WemVAjhHo+XwCZYTMsFTQ+Mddp3RDFnpdOsSVobaHa7Czl1aj5CyM\n44KrlkMh1G+R5kvTyegQCN44/+53v2vV+SISVpJS1gghlgH/KYT4MSqiOBHlnDoghJgIbJZSlgsh\nrkN5Ho9FYiwaGi1FS4vsgsNVsX1j2bZim7/xPVkoraTrUQvvh9CtsRunF56mcWCjOsZsgzke5RFk\nocI+Jh02MehDvdC1tCunfSESvsdRRXYurw1IGkCxt1gxi7ZiE8IjH3XXBhWKWRd1vyE0i8jejqPh\n1gZlFI8BS0DECc6LP49hlwwjsTaRSk9l2FpQEPq3uO2a27RhaCOcrTqH48CvzByCEOImYI2UMsV4\n/BowBvXTfwX8t5Tyv0OcV1NZNdoE4fRWbsk5xCqBvEqqnfAOAqEeM4wTDHNhNumkWTgX8ZXQI7YH\np6pOqYxf8AJ/NQzYPoDY3pZmRd4ADXb6k9OVmqzZnMeE+ZlD8Ut2pzeks+m1Ta40Xs8BD8VXFyvD\nYDnHxNKJrHhxxRmvlVVG
fMa3Z/jpuSmxKRR6Cjl0wyFNYW0hoqbO4WxBGweNtkRri6SmzpzKwmRn\n2CU9P53a+lrKxpQFXivAvfbAoKwCKgvnQxmIfSgDcwLlaRwyXvOgCuu64K/dIBVyPbm8OOdF1+/j\nKfaQPSWbo2OcRWd+o3WGBdlT7CH7W9kc7X7U9rmgPtutdsT6XpsRNfSaGsYFpLQHfjCQYecPo4KK\nTl+wdjYQFXUOGhqdBU0lrN0QvPv94vAXcGXQQfGQcWkGEqlUXE2YrTiDC+WuN173onIBo1G5gMko\nwzEaZ7+FTahgbip+2uo+3z5mz5vtuqhmpGcw+prR7vmDQyAWCe688U6e/at7vwxzcT865miL2ESO\nhPM+AobBuGYHhx9kZOVIlv95eajTaJxFaOOgodFCuFFfk75Kgn44wiz+xdIaR09FaQm8pf6OPR6L\nb4wvUGy2Cfr17Ef85/Ec7H0woM3kIunNKFQC+xr8oauj8UdZ6F3op+MCjjDOW7PeomZ0oFMaa9R7\nZT/Jvo9D80layyZyJJx1v+eogzYOGhothNsCWXVrlavcQ6hkLrtQgnSpcO2ea+lb25ed63eCVwnt\nmb0LRn13lEoiNyHpneZLg63YQ1cGHXfkN0cG8g8WxdPfT/s9D89/GNlfqsT3CGNMKU3TeP09IlzG\nEE5ewJFwDkGD1fUM7QdtHDQ0WohQdMvBVwwmszLTtdBuw183BPpUW1lKXnvXsmBsem2T8lKyihSL\nyWUhHXzBYPZ8uUdVHllzAPFQUlOiZC6CjMZv/vwb5Dhp93SMzmjkuO/cPcUe9n6yFzKdYwiXTWSj\n/8YDWRC31p5zCJf2qnF2oI2DhkYLEYpumdk39CKfkZ4RWOiDWFFNLYRWuuy+i/exZ/UefON9tnqC\nvMY8m5idNQdAV1y9jeq06gDNNjXwvCkt7rZznz1vttI7CsqXJOUlMXdReIu5W7X6jD8ptlJzWqBq\nnD1otpKGRgvRGupra1lRk+6fxIovV6giuXqUgN63cSaXt6DkOAR22Qvz9W0ElFdz7O/LTHH/Lrn3\n51KQUWBr9YlQEt3bl2wP+ztonF1otpKGRjshIz2DBY8tYNqj0yj3lZMam8qCpxeEtcg3lxUFdmbU\nJ59/oiqFQFFiz8e9UvoUKlkNTkmONQR0j6qNYyzy2aYYXjBsciE5gfdlVmY2+3sE92fQiB5oz0FD\no4WIRNFciz8rj0ABWz7KMwjVr8GoWbhw+4U0VjVSEleiaiVGo3INXtUqc/glw8ns72wmFMnv3ZbX\n7FyHLoLT0GgnhCp4m1I5JeK9BhyfVU6g2tqsog6qfUjckMjoK0bbisgAhkwcEmBTtXDcLQ2LteU1\nO9ehw0oaGu2Elgr1ReSzjOZBaevTyLgog70f7KVqeJUyFD5IKk9i9XOrXXsjD75iMDvid7Rq3C0J\ni7l+jxZ8tkbbIFLNfjQ0zjn4Y+9WnCVuvutnJcJtN9zG9iXb2b1oN1NippB7US5Thk5h98rdroYB\nFJuqrcYdjLa8Zhqtgw4raWi0EO2ac2jFZ7Vn3F/nHNoOOuegodGOaC0ltb0+qy3HHU2ffS5BGwcN\nDQ0NDQdaaxx0zkFD4xyBp9jD1JlTyb0/l6kzp+Ip9pzV92l0bGjPQUPjHEBLY/06R9BxoT0HDQ2N\nM8JNQdZUXT0b79Po+NDGQUPjHEBJRUmL+iW09H0aHR/aOGhonANoaX2Brks4d6FzDhoa5wB0zuHc\ng6ayamhohIWW1hfouoSOCW0cNDQ0NDQc0GwlDQ0NDY2IQxsHDQ0NDQ0HtHHQ0NDQ0HBAGwcNDQ0N\nDQe0cdDQ0NDQcEAbBw0NDQ0NB7Rx0NDQ0NBwQBsHDQ0NDQ0HtHHQ0NDQ0HBAGwcNDQ0NDQe0cdDQ\n0NDQcEAbBw0NDQ0NByJiHIQQPxdCvC+EqBNCLAjj+IeFEEeEECeFEP8jhOgSiXFoaGhoaEQGkfIc\nSoC5wItnOlAIMRZ4FMgF0oFM4HcRGoeGhoaGRgQQEeMgpVwupVwJfB3G4d8HXpRS7pdSnkIZlR9E\nYhztiYKCgvYeQljQ44wcOsIYQY8z0ugo42wt2iPnkAV8bHn8MdBXCNGzHcYSMXSUCaPHGTl0hDGC\nHmek0VHG2Vq0h3FIAk5ZHp8CBJDcDmPR0NDQ0HDBGY2DECJfCNEohPC5/Nvcgs+sAlIsj1MACVS2\n4FwaGhoaGmcBEW0TKoSYCwyQUk5v4piFwJdSytnG41HAq1LK/iGO1z1CNTQ0NFqA1rQJjYvEAIQQ\nsUAXIBaIE0J0BRqklD6Xw/8XeEkI8RpQCvwGeCnUuVvz5TQ0NDQ0WoZI5Rx+C9QAvwKmGH//BkAI\ncaEQokIIcQGAlHId8DSQD3iMf3MiNA4NDQ0NjQggomElDQ0NDY3OAS2foaGhoaHhQNQZh+ZIcQgh\npgkhGoywVaXx/83RNk7j+HaRDBFC9BRCvCWEqBJCeIQQ32ni2CeEEN6g65keBeN6SghxXAhxTAjx\n1NkYT2vH2ZbXzuWzm3PPtJt0TbjjbOf7Ot64LsVCiFNCiA+FEOOaOL697uuwx9nS6xl1xoFmSHEY\n2CalTJFSJhv/t4Re2xJ0FMmQ54A6oA8wFfibEOLyJo5fFHQ9i9tzXEKInwATgSuBIcAEIcSMszSm\nFo/TQFtdu2CENRejQLqmOfd2e93XccBBYKSUsgfwH8BiIcTA4APb+XqGPU4Dzb6eUWccminF0W7o\nCJIhQohE4B7gt1LKWinlu8BK4Htn+7MjOK7vA89IKY9IKY8AzwD3R+E42w3NmIvtKl3TEe5tKWWN\nlPI/pZSHjMerUaSZa1wOb7fr2cxxtghRZxxagGFCiKNCiP1CiN8KIaLxO7WXZMilKEpxUdBnZzXx\nnjuNEM4eIcRPo2BcbteuqfFHEs29fm1x7VqDjiRdExX3tRAiDRgE7HN5OWqu5xnGCS24nhGpc2hH\nvAMMllL+nxAiC1gM1ANtGpcOA01Jhpxsw881PzuUVMkbwN+BMiAbWCqEOCmlfKMdx+V27ZIiPJ5Q\naM442+ratQbtNQ+bi6i4r4UQccCrwMtSygMuh0TF9QxjnC26nm1qjUWEpTiklMVSyv8z/t4H/Ccw\nOdrGyVmSDAljnFVAj6C3pYT6XMM9LpUK24E/EYHr6YLg69HUuNyuXdVZGJMbwh5nG1671qBDSNec\nrfu6ORBCCNSCexp4MMRh7X49wxlnS69nmxoHKWWulDJGShnr8i9SbIRWV1SfhXHuA66yPB4KlEkp\nW7W7CGOcB4BYIUSm5W1XEdr1dHwEEbieLjiAqqQPZ1xu1y7c8bcWzRlnMM7WtWsNzso8bCO09bV8\nEegN3BNC6QGi43qGM043nPF6Rl18XggRK4RIwCLFIZQ8h9ux44QQfY2/v4Gq1F4ebeNESYb8UAhx\nuRGPbFIyJFKQUtYAy4D/FEIkCiFGoJg//3Q7XggxUQiRavx9HTCTs3A9mzmu/wVmC
SH6CyH6A7No\ng2vX3HG21bVzQzPmYrvMw+aOsz3va+Mznwe+AUyUUnqbOLS9r2dY42zx9ZRSRtU/4AmgEfBZ/v2H\n8dqFQAVwgfH4Dyh9pkrgC+O9sdE2TuO5h4yxlgP/A3Rpo3H2BN5CucDFwL2W124CKiyPXwOOG2P/\nBPh5W48reEzGc08CJ4yx/b82no9hjbMtr124c9GYh5XRMA+bM852vq8HGmOsMT6/0vhNvxNl9/WZ\nxtnq66nlMzQ0NDQ0HIi6sJKGhoaGRvtDGwcNDQ0NDQe0cdDQ0NDQcEAbBw0NDQ0NB7Rx0NDQ0NBw\nQBsHDQ0NDQ0HtHHQ0NDQ0HBAGwcNDQ0NDQe0cdDQ0NDQcOD/A/ncC8g+DyHXAAAAAElFTkSuQmCC\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "plt.plot(X_moons[y_moons == 1, 0], X_moons[y_moons == 1, 1], 'go', label=\"Positive\")\n", "plt.plot(X_moons[y_moons == 0, 0], X_moons[y_moons == 0, 1], 'r^', label=\"Negative\")\n", "plt.legend()\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We must not forget to add an extra bias feature ($x_0 = 1$) to every instance. For this, we just need to add a column full of 1s on the left of the input matrix $\\mathbf{X}$:" ] }, { "cell_type": "code", "execution_count": 111, "metadata": {}, "outputs": [], "source": [ "X_moons_with_bias = np.c_[np.ones((m, 1)), X_moons]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's check:" ] }, { "cell_type": "code", "execution_count": 112, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[ 1. , -0.05146968, 0.44419863],\n", " [ 1. , 1.03201691, -0.41974116],\n", " [ 1. , 0.86789186, -0.25482711],\n", " [ 1. , 0.288851 , -0.44866862],\n", " [ 1. , -0.83343911, 0.53505665]])" ] }, "execution_count": 112, "metadata": {}, "output_type": "execute_result" } ], "source": [ "X_moons_with_bias[:5]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Looks good. Now let's reshape `y_train` to make it a column vector (i.e. a 2D array with a single column):" ] }, { "cell_type": "code", "execution_count": 113, "metadata": { "collapsed": true }, "outputs": [], "source": [ "y_moons_column_vector = y_moons.reshape(-1, 1)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's split the data into a training set and a test set:" ] }, { "cell_type": "code", "execution_count": 114, "metadata": { "collapsed": true }, "outputs": [], "source": [ "test_ratio = 0.2\n", "test_size = int(m * test_ratio)\n", "X_train = X_moons_with_bias[:-test_size]\n", "X_test = X_moons_with_bias[-test_size:]\n", "y_train = y_moons_column_vector[:-test_size]\n", "y_test = y_moons_column_vector[-test_size:]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Ok, now let's create a small function to generate training batches. In this implementation we will just pick random instances from the training set for each batch. This means that a single batch may contain the same instance multiple times, and also a single epoch may not cover all the training instances (in fact it will generally cover only about two thirds of the instances). However, in practice this is not an issue and it simplifies the code:" ] }, { "cell_type": "code", "execution_count": 115, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def random_batch(X_train, y_train, batch_size):\n", " rnd_indices = np.random.randint(0, len(X_train), batch_size)\n", " X_batch = X_train[rnd_indices]\n", " y_batch = y_train[rnd_indices]\n", " return X_batch, y_batch" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's look at a small batch:" ] }, { "cell_type": "code", "execution_count": 116, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[ 1. , 1.93189866, 0.13158788],\n", " [ 1. , 1.07172763, 0.13482039],\n", " [ 1. 
, -1.01148674, -0.04686381],\n", " [ 1. , 0.02201868, 0.19079139],\n", " [ 1. , -0.98941204, 0.02473116]])" ] }, "execution_count": 116, "metadata": {}, "output_type": "execute_result" } ], "source": [ "X_batch, y_batch = random_batch(X_train, y_train, 5)\n", "X_batch" ] }, { "cell_type": "code", "execution_count": 117, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[1],\n", " [0],\n", " [0],\n", " [1],\n", " [0]])" ] }, "execution_count": 117, "metadata": {}, "output_type": "execute_result" } ], "source": [ "y_batch" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Great! Now that the data is ready to be fed to the model, we need to build that model. Let's start with a simple implementation, then we will add all the bells and whistles." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "First let's reset the default graph." ] }, { "cell_type": "code", "execution_count": 118, "metadata": { "collapsed": true }, "outputs": [], "source": [ "reset_graph()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The _moons_ dataset has two input features, since each instance is a point on a plane (i.e., 2-Dimensional):" ] }, { "cell_type": "code", "execution_count": 119, "metadata": { "collapsed": true }, "outputs": [], "source": [ "n_inputs = 2" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's build the Logistic Regression model. As we saw in chapter 4, this model first computes a weighted sum of the inputs (just like the Linear Regression model), and then it applies the sigmoid function to the result, which gives us the estimated probability for the positive class:\n", "\n", "$\\hat{p} = h_\\boldsymbol{\\theta}(\\mathbf{x}) = \\sigma(\\boldsymbol{\\theta}^T \\mathbf{x})$\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Recall that $\\boldsymbol{\\theta}$ is the parameter vector, containing the bias term $\\theta_0$ and the weights $\\theta_1, \\theta_2, \\dots, \\theta_n$. The input vector $\\mathbf{x}$ contains a constant term $x_0 = 1$, as well as all the input features $x_1, x_2, \\dots, x_n$.\n", "\n", "Since we want to be able to make predictions for multiple instances at a time, we will use an input matrix $\\mathbf{X}$ rather than a single input vector. The $i^{th}$ row will contain the transpose of the $i^{th}$ input vector $(\\mathbf{x}^{(i)})^T$. 
It is then possible to estimate the probability that each instance belongs to the positive class using the following equation:\n", "\n", "$ \\hat{\\mathbf{p}} = \\sigma(\\mathbf{X} \\boldsymbol{\\theta})$\n", "\n", "That's all we need to build the model:" ] }, { "cell_type": "code", "execution_count": 120, "metadata": { "collapsed": true }, "outputs": [], "source": [ "X = tf.placeholder(tf.float32, shape=(None, n_inputs + 1), name=\"X\")\n", "y = tf.placeholder(tf.float32, shape=(None, 1), name=\"y\")\n", "theta = tf.Variable(tf.random_uniform([n_inputs + 1, 1], -1.0, 1.0, seed=42), name=\"theta\")\n", "logits = tf.matmul(X, theta, name=\"logits\")\n", "y_proba = 1 / (1 + tf.exp(-logits))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In fact, TensorFlow has a nice function `tf.sigmoid()` that we can use to simplify the last line of the previous code:" ] }, { "cell_type": "code", "execution_count": 121, "metadata": { "collapsed": true }, "outputs": [], "source": [ "y_proba = tf.sigmoid(logits)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As we saw in chapter 4, the log loss is a good cost function to use for Logistic Regression:\n", "\n", "$J(\\boldsymbol{\\theta}) = -\\dfrac{1}{m} \\sum\\limits_{i=1}^{m}{\\left[ y^{(i)} \\log\\left(\\hat{p}^{(i)}\\right) + (1 - y^{(i)}) \\log\\left(1 - \\hat{p}^{(i)}\\right)\\right]}$\n", "\n", "One option is to implement it ourselves:" ] }, { "cell_type": "code", "execution_count": 122, "metadata": {}, "outputs": [], "source": [ "epsilon = 1e-7 # to avoid an overflow when computing the log\n", "loss = -tf.reduce_mean(y * tf.log(y_proba + epsilon) + (1 - y) * tf.log(1 - y_proba + epsilon))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "But we might as well use TensorFlow's `tf.losses.log_loss()` function:" ] }, { "cell_type": "code", "execution_count": 123, "metadata": { "collapsed": true }, "outputs": [], "source": [ "loss = tf.losses.log_loss(y, y_proba) # uses epsilon = 1e-7 by default" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The rest is pretty standard: let's create the optimizer and tell it to minimize the cost function:" ] }, { "cell_type": "code", "execution_count": 124, "metadata": { "collapsed": true }, "outputs": [], "source": [ "learning_rate = 0.01\n", "optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)\n", "training_op = optimizer.minimize(loss)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "All we need now (in this minimal version) is the variable initializer:" ] }, { "cell_type": "code", "execution_count": 125, "metadata": { "collapsed": true }, "outputs": [], "source": [ "init = tf.global_variables_initializer()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "And we are ready to train the model and use it for predictions!" 
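 ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As a quick side note before training: instead of applying the sigmoid and then the log loss in two steps, TensorFlow can also compute the cross entropy directly from the logits, which is a bit more numerically stable. The next cell is only a sketch of that variant (it assumes the `logits` tensor defined above, the variable names are purely illustrative, and it is not used in the rest of this notebook):" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Sketch: compute the cross entropy straight from the logits (assumes the\n", "# `logits` tensor defined above); the names below are illustrative only.\n", "xentropy = tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits)\n", "loss_from_logits = tf.reduce_mean(xentropy, name=\"loss_from_logits\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In the rest of this notebook we keep the `tf.losses.log_loss()` version defined earlier."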
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There's really nothing special about this code, it's virtually the same as the one we used earlier for Linear Regression:" ] }, { "cell_type": "code", "execution_count": 126, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch: 0 \tLoss: 0.792602\n", "Epoch: 100 \tLoss: 0.343463\n", "Epoch: 200 \tLoss: 0.30754\n", "Epoch: 300 \tLoss: 0.292889\n", "Epoch: 400 \tLoss: 0.285336\n", "Epoch: 500 \tLoss: 0.280478\n", "Epoch: 600 \tLoss: 0.278083\n", "Epoch: 700 \tLoss: 0.276154\n", "Epoch: 800 \tLoss: 0.27552\n", "Epoch: 900 \tLoss: 0.274912\n" ] } ], "source": [ "n_epochs = 1000\n", "batch_size = 50\n", "n_batches = int(np.ceil(m / batch_size))\n", "\n", "with tf.Session() as sess:\n", " sess.run(init)\n", "\n", " for epoch in range(n_epochs):\n", " for batch_index in range(n_batches):\n", " X_batch, y_batch = random_batch(X_train, y_train, batch_size)\n", " sess.run(training_op, feed_dict={X: X_batch, y: y_batch})\n", " loss_val = loss.eval({X: X_test, y: y_test})\n", " if epoch % 100 == 0:\n", " print(\"Epoch:\", epoch, \"\\tLoss:\", loss_val)\n", "\n", " y_proba_val = y_proba.eval(feed_dict={X: X_test, y: y_test})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Note: we don't use the epoch number when generating batches, so we could just have a single `for` loop rather than 2 nested `for` loops, but it's convenient to think of training time in terms of number of epochs (i.e., roughly the number of times the algorithm went through the training set)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For each instance in the test set, `y_proba_val` contains the estimated probability that it belongs to the positive class, according to the model. For example, here are the first 5 estimated probabilities:" ] }, { "cell_type": "code", "execution_count": 127, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[ 0.54895616],\n", " [ 0.70724374],\n", " [ 0.51900256],\n", " [ 0.9911136 ],\n", " [ 0.50859052]], dtype=float32)" ] }, "execution_count": 127, "metadata": {}, "output_type": "execute_result" } ], "source": [ "y_proba_val[:5]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To classify each instance, we can go for maximum likelihood: classify as positive any instance whose estimated probability is greater or equal to 0.5:" ] }, { "cell_type": "code", "execution_count": 128, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[ True],\n", " [ True],\n", " [ True],\n", " [ True],\n", " [ True]], dtype=bool)" ] }, "execution_count": 128, "metadata": {}, "output_type": "execute_result" } ], "source": [ "y_pred = (y_proba_val >= 0.5)\n", "y_pred[:5]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Depending on the use case, you may want to choose a different threshold than 0.5: make it higher if you want high precision (but lower recall), and make it lower if you want high recall (but lower precision). See chapter 3 for more details." 
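 ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For example, here is a sketch of how a stricter threshold could be applied (the value 0.7 is purely illustrative, not a recommendation):" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Sketch: a stricter (hypothetical) threshold trades some recall for extra\n", "# precision. It reuses the y_proba_val array computed during training above.\n", "higher_threshold = 0.7\n", "y_pred_strict = (y_proba_val >= higher_threshold)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Sweeping such a threshold over many values is exactly what the precision/recall curve from chapter 3 summarizes. Here we stick with the default 0.5 threshold."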
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's compute the model's precision and recall:" ] }, { "cell_type": "code", "execution_count": 129, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "0.86274509803921573" ] }, "execution_count": 129, "metadata": {}, "output_type": "execute_result" } ], "source": [ "from sklearn.metrics import precision_score, recall_score\n", "\n", "precision_score(y_test, y_pred)" ] }, { "cell_type": "code", "execution_count": 130, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "0.88888888888888884" ] }, "execution_count": 130, "metadata": {}, "output_type": "execute_result" } ], "source": [ "recall_score(y_test, y_pred)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's plot these predictions to see what they look like:" ] }, { "cell_type": "code", "execution_count": 131, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYcAAAEFCAYAAAAIZiutAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztnXl8VNXZ+L8ngQAhCTsIKCQNLixVEQQXhERLoEWRolVZ\nFItVW6tsVUQURdMqwk+wvLy2L5WIC1g1aKEIARWigODOIlUsMQEEZBEwCQEDmfP7484Ms+fOzM0s\nmef7+cwnmTvn3vvcc2fOc8+zHaW1RhAEQRBcSYq2AIIgCELsIcpBEARB8EKUgyAIguCFKAdBEATB\nC1EOgiAIgheiHARBEAQvRDkIgiAIXliqHJRSf1RKfaKUOqmUKgjQboxS6rRSqlwpVWH/299KWQRB\nEITQaWDx8fYC+cAgoEktbT/UWotCEARBiEEsVQ5a638BKKUuBTpaeWxBEAQhckTT59BTKXVQKfW1\nUuoRpZT4PwRBEGIEq81KZnkf6KG13qWU6g68DpwCno6SPIIgCIILUXla11qXaa132f/fDjwB3BgN\nWQRBEARvojVz8IXyuVEpKRsrCIIQAlprn+OqGawOZU1WSjUGkoEGSqlGSqlkH+0GK6Xa2v+/AHgE\n+Je/42qtY/712GOPRV0GkVNkFDlFTscrXKw2Kz0CVAEPAqPs/z+slDrHns9wtr3dNcBWpVQFsBwo\nBJ6yWBZBEAQhRKwOZX0ceNzPx+ku7R4AHrDy3IIgCIJ1SPioReTk5ERbBFOInNYRDzKCyGk18SJn\nuCgrbFN1iVJKx7qMgiAIsYZSCh2GQzqWopUEQYgzMjMz2bVrV7TFSGg6d+5MWVmZ5ceVmYMgCCFj\nfzqNthgJjb97EO7MQXwOgiAIgheiHARBEAQvRDkIgiAIXohyEARBqIU9e/aQkZER0L+Snp5eJ47h\naCHKQRCEeklmZiapqalkZGTQvn17xo4dS1VVVUjHOueccygvL0cpw7+bm5tLQYH7YpcVFRVkZmaG\nK3bMIMpBEATLKS0rZfS40eTensvocaMpLSuN+DGUUrz99tuUl5fz+eef88knn/DnP/85aDkSFVEO\ngiBYSmlZKQPvHcii9EUUZxWzKH0RA+8dGNTgbsUxAKcZqH379vzyl7/kyy+/ZP/+/QwdOpRWrVpx\n3nnn8fzzzzvbf/LJJ1x66aU0a9aM9u3bc//99wOwa9cukpKSsNlsPPLII6xbt457772XjIwMxo0b\nB0BSUhLffvstH330Ee3bt3czQb311ltcdNFFTplmzJhBly5daNOmDbfccgvHjh0L6roigSgHQRAs\nZdrsaZRcVAIp9g0pUHJRCdNmT4voMVzZs2cPK1asoGfPnowYMYJOnTrx/fff88YbbzB16lTWrl0L\nwPjx45kwYQI//vgjJSUl3HTTTc5jOExKf/7zn7nqqquYN28e5eXlzJ071+3zvn37kpaWxpo1a5z7\nvvrqq4wePRqAv/71ryxbtox169axb98+WrRowT333BPSddUlohwEQbCUveV7zwzqDlJgX/m+iB4D\nYNiwYbRs2ZL+/fuTm5vLnXfeyYYNG5g5cyYNGzbkoosu4ne/+x0vv/wyAA0bNmTnzp388MMPpKam\n0qdPH9Pncp0p3HLLLSxevBgwfBErVqxgxIgRAMyfP5+//OUvtG/fnoYNG/Loo49SWFiIzWYL6trq\nGlEOgiBYSseMjlDtsbEaOmR0iOgxAJYuXcqRI0coLS3lf/7nf9i3bx8tW7YkNTXV2aZz587s3bsX\ngIKCAnbs2MEFF1xA3759efvtt4M6n4ORI0fy1ltvcerUKd5880169erF2WcbKxbs2rWLX//617Rs\n2ZKWLVvSrVs3GjZsyIEDB0I6V10hykEQBEvJn5RP9pbsM4N7NWRvySZ/Un5EjwF4hZ526NCBI0eO\ncPz4cee23bt307FjRwCys7NZvHgxhw4dYvLkydx4442cOHHC67gOE5I/unbtSufOnVmxYgWvvvoq\nI0eOdH7WqVMnVq5cyZEjRzhy5AhHjx7l+PHjtG/fPqhrq2tEOQiCYClZmVm8M+8dRlWMIrc0l1EV\no3hn3jtkZWZF9Bi+OPvss7niiit46KGH+Omnn9i6dSsLFixw+gMWLVrE4cOHAWjWrBlKKZKTjcUs\nXRVNu3bt+PbbbwOea+TIkcydO5d169bxm9/8xrn97rvvZurUqezevRuAQ4cOsWzZsrCuq06I9lJ2\nJpa604IgxCax/PvMysrS7733ntf2vXv36muvvVa3bNlSd+nSRc+fP9/52ejRo3Xbtm11enq67tGj\nh162bJnWWuuysjKdlJSka2pqtNZab9y4UZ933nm6ZcuWevz48VprrZOSknRJSYnzWLt379bJycn6\nuuuuczu/zWbTc+bM0eeff77OyMjQXbp00Q8//HDI1+nvHti3hzz2SlVWQRBCRqqyRh+pyioIgiBE\nDFEOgiAIgheiHARBEAQvRDkIgiAIXohyEARBELwQ5SAIgiB4IcpBEARB8EKUgyAIguCFKAdBEIQ6\n5le/+pWz8mu8IMpBiDm01sycMkUyb4WwyMzM5KyzznIrnLdgwQJyc3Pr9LyPP
/44t912m9u2FStW\ncOutt9bpea1GlIMQc6xasoT9zz3H6jffjLYoQhhYoeTDOYZSipqaGp599lmv7ULtiHIQYgqtNav+\n3/9jdkUFRbNm1TooyCwjdrFCyYd7jAceeIBnnnmG8vJyr8++/vpr8vLyaNWqFV27duWNN95wfnbk\nyBGuu+46mjVrRt++fZk2bRpXXXWV8/MJEybQqVMnmjVrxqWXXsr69esNeVet4sknn+S1114jPT2d\nnj17ApCbm0tBQQHV1dW0aNGC//znP85jHT58mNTUVGc12OXLl9OzZ09atGhBv3792LZtW0jXHi6i\nHISYYtWSJQzetg0F5G3bxh+GDw848MssIzYJVsnX1TF69+5NTk4Os2bNctteVVVFXl4eo0eP5vDh\nw7z66qvcc889fPXVVwDcc889pKenc/DgQRYuXMiLL77oNuPo06cPW7du5ejRo4wcOZLf/OY3VFdX\nM2jQIKZOncrNN99MRUUFX3zxhdt5U1JSuOGGG3j11Ved215//XVycnJo3bo1n3/+OXfccQf/+Mc/\nOHLkCHfffTdDhw7l1KlTQV97uIhyEGIGx2CQV1UFwOCqKg4sW8aqJUsCtg9n8BDqBlclP2jbtpCU\ntxXHAMMHMG/ePH744QfntuXLl5OVlcVtt92GUoqLL76YG264wblc55tvvskTTzxBo0aN6Nq1K2PG\njHE75siRI2nevDlJSUlMnDiRn376iR07dpiSZ8SIEc4lRAEWL17MqFGjAHj++ef5/e9/T+/evVFK\nceutt9KoUSM2bdoU0rWHgygHIWZwHQwAFHCXzcYLDz2EzWbj6SlTePrBB51KwKrBQ7AWTyU/qKoq\naOVtxTEcdO/enWuvvZannnrKuW3Xrl1s2rTJuVRnixYtWLx4MQcOHODQoUOcPn3auawnwDnnnON2\nzGeeeYZu3brRokULWrRoQXl5udMsVBtXX301J0+e5JNPPmH37t1s2bKFYcOGOeV65pln3OT67rvv\n2LcvuLWzrUCUgxAzbNuwgQ9792b6gAGM69aNcUqxCWhdVsbTkyfz3dy5fD53LqvffNP04CE+icjj\nS8kHq7ytOIYr06dP5x//+IdzrehOnTqRk5PjtlRneXk58+bNo02bNjRs2JDvvvvOuf+ePXuc/69b\nt46ZM2dSWFjI0aNHOXr0KBkZGc7vWG0Ob6UUN910E4sXL2bx4sVce+21NG3aFDCU0MMPP+wmV2Vl\nJTfffHNI1x0ODSJ+RkHwwwNz5gDGgD7p8suZrTUK0KdPM/zvf2fJiRNMAlbOnInNZvM7eAy64Qbn\nMZ0+iUsvddsu1B3bNmygsndvNroMklpr0tavN30PrDiGK9nZ2dx8883MnTuXCy+8kCFDhvDggw/y\nyiuvcMstt6C1ZsuWLaSnp3P++eczfPhwp0LZtWsXL730Ep07dwagsrKShg0b0qpVK6qrq5kxYwYV\nFRXOc7Vr1453333XWE3Nj6IYMWIEw4YNo3Xr1vzlL39xbr/zzjsZPnw411xzDX369OH48eO8//77\nDBgwwKlAIkY4y8h5voA/Ap8AJ4GCWtpOBPYDR4HngYZ+2vldHk+on6x84w1dlJqqNThf/wZdBHol\n6CdTUvQ9Q4boR/v3148NGOB8Pdq/v545YYLzODabTY/v00fPAD2+Tx9ts9mieFX1k1j+fXouE7pn\nzx7dpEkTffXVV2uttf7mm2/0kCFDdJs2bXTr1q31Nddco7ds2aK11vrQoUN6yJAhulmzZrpPnz56\nypQp+he/+IXWWuuamhp9xx136IyMDN2hQwc9a9Yst3P98MMPul+/frpFixa6V69eWmutc3Nz9YIF\nC9zk69Kli27durU+deqU2/ZVq1bpSy+9VLdo0UJ36NBB33TTTbqystLvdfq7B8TSMqFKqWGADRgE\nNNFaj/XTbhCwEMi1K4h/ARu11lN9tNVWylgf0Voz66GHeOCpp+I+hltrzQ19+tCjSROSkpLQWlPy\nxRf8rKKCdOB+jKcK+vRhzqZNAa+3qLCQL0aN4mB1NW1TUrhk8WKZPVhMoiwTOmXKFA4cOMALL7wQ\nbVG8iItlQrXW/9JaLwOO1NL0NmCB1vprrfWPQD7wWytlSSTqUzjnqiVL6LxjB1eOH8/04mIuv+8+\nRlRX8wTwAIb5aDDQbvPmgNertaZo1iwOVFczGzhQXc3KmTMTYiATwmfHjh3O/IKPP/6YBQsWMHz4\n8ChLFVmi5ZDuDmxxeb8FaKuUahEleeIWXY/COX1dy7YNG1jSujVjMjK4vVkzbm/WjFebNeOTNm3Y\nak888sWqJUtot3kzv8S8QhEEBxUVFQwfPpy0tDRuueUWHnjgAa677rpoixVZwrFJ+XthzAT8+hyA\nnUCey/sGGOaoTj7a+rW1Ce72+ZWpqbqosNBnO5vNpp9+8MGI2N1DOZfNZtN3X3+9XmniWszw9Pjx\n+ob0dG2z+yxsoG9IT9fDevUS34OFyO8z+vi7B4Tpc4jWzKESyHB5nwFooMJX4+nTpztfxcXFERAv\nPtDafCy4q+lJ67oN7wzFzFVUWMiBf/+bQRbEtQNc2K8fd9bUuEUz3V5dTcr27TJ7EOolxcXFbmNl\nuFjqkHYeVKl8oKP275BeBHyrtZ5mf3818IrWuoOPtrquBrF4p6iwEDVmjHNABShKTUW99JKb41Vr\ne2joRx8xqW9f8v70J1bfcQeDX3jBcget57lmb9xYq5Nca80t553Hb3fuZLDr9fm4FrPMmjiRys8/\nd55ba83OL77goooK9puUS6idRHFIxzJ15ZC2OlopGWgIPAqcDdwJnNZa13i0GwS8AFwDfA8UApu0\n1g/7OKYoBz94DoBgjwW/5BJnzgC4K5GVTZrw+tlnU/Df/5oevIPB9VxmB/eiwkL+PWIErU+f5qhS\n0LUrLdu08XktkZRLqB1RDtGnrpSD1b6GxzB8BzUur0eBczBMRme7tJ2AoRiOIXkOdYbNZtMT+vZ1\n2t5XgF6alBTQrh+qf8LzXDYw3gc4Tk1NjR7Yvr2u8bGPVX6SUOTydYxI+Wziic6dO2sMk7C8ovTq\n3Lmzz3tDmD6HOnFIW/kS5RAerg5rG+gJ9r+BBsmVb7yhJ6SnB+0Q9pW8Vptj+clJk/S9GAlunvuE\nKocvuVYGKZevY1ghiyBEinCVg5TPqOe4liHYdegQw7/+GmWzAb5LTmh9Jpx00qxZ5A0fbtrs5Fny\nQGvN56WlXLVunU8TjtaazYsX80/gpvR0PuzZ0zlFbrpuHfs3bQpJDl9y7c7M5JWvv6blBRecMVmZ\nLMVgs9l4Ydw4/mmBLIIQN4SjWSLxQmYOljFzwoRaS06YDY01Q21P24HOZaUcrmYl15mSWVPRk5Mm\n6aV+ZhyOY9TU1IjZSYgpELOSYBW+bPPj+/bVMyZPDsv/4Mt0FcgPYIWPwBV/isaMqaimpkYPa9rU\nryyOYzz5pz+J2UmIKUQ5CG6E4zj1
WfCuUSN9U+PGYfkffD35B/JPhOK78Ic/RVNTUxNQeTl4ctIk\nvdxFDldZXI89rGlTXROmEhMEKwlXOdRJnoOVSChrcBQVFrJq7FgGFRSw9dNPgyrGZ1VugNZnch0U\nRkiFZ9hsoDBcwFSIrhn85YJ88Yc/cMnf/hYwtFVrzbXt29PrwAGS7NdRkp7Oz3r2JP2SS/j5lVc6\nj70cI4ZbS5isECPEVChrXbyQmYNpXJ9kb+rSRY8P08wRqt3fyif/cPHlZ5l21VX6prPOMhW15e86\nfM5IwDl7EB+EEG2QmYPgwPGUnFdVxVilKNCaiX360D43l8lBlvPWJp7+/WE2OS9amM0sD3QdrrMG\n5zEwIsC0fWZycP78kGZwgmAFMZUhXReIcjCH62C+yr5tMPBkSgr7k5O59uWXgzJ1mB1A4xFf5rM9\npaVcMHw4k599Nqhj7CkpoaaiwvghAhVpafT42c/Y8d//8s/vv2fsuefSbP9+frlwYdz3mxBfiHIQ\nAPdZwyRgtn37RGAOMLFvX+YEUSoj1p/+wZDHikWOHH4aq2pNuSrWpUlJNLLZWCX1nIQIIz6HOMbK\nkgwO2/rt3brpZY7yGC6Zx/9u1MhpK68vtnArspZrC7kN53iuvogVTZpImKsQUQjT5xCtkt0C1q7g\n9sCcOTz+/vt0y8vj0379ePSqq3i8YUMG2j8f8tNPFM2aRVFhYb1YNU5r84scae2/RPmqJUsYvG2b\nW7Z4OKxasoRBW7cyC8NPozDWzFUnTsT9YkxCYiHKIUoEM7gFg0NJNLn0UnqfOsU79u0KyNu2jYVT\np9aLVeOCGdT9KWHHPTCzHoZZtm3YwGuZmXyblMTNwHRgI7DNhJyCEEuIcogSVj6xej4Za23ULJoL\nPJ+ezmP9+zN9wABWZGbSuqzMsqfkaBHMoB5ICbveA8CSfrl/9myaZ2TwN5sNW3o6un9/1IABHB8w\ngI29e7Nl3Tqve1WXCy8JQsiEY5OKxIt66HMItTyEP3+Bp+3dV36C1SUpokkweRSBcjXM1JoKRTZH\nBdgVfjLDPe+VlN0Q6gKkfEb8EWqSmK+BxNOh6loWwlUJrHj99ZhJTAsXs4N6pBVibecLdK/iVVEL\nsUu4ykFCWaNAKGGiWvteftNzhTPXshAOilJTWZqbS1t7PL7Zc8Y7kc7VKCoshDFjGOxyvpWpqSTZ\nzxfoXtWXHBIhdpA8hwTB1zKXecOHe2UxX9uuHb3OP5+kBFIC/oh0rsasiRPZvXo16quvaKk1R5RC\nd+1Kp7w87p892+1e2YAbmjblzePHnfduYt++tB8wgMkzZkg+hBA2kueQAPgzV9QnU1F9IJBZydOU\nuBK8qr2GWgFXEHyBmJXqP/7MI4loKoplApmxtq5f7zaL2bpzJ+kVFSSnp9OpSxe0Dq0CriD4I9yZ\ngywTGgd4Lr8JhhL42bnnihKIIfzdp7T162u9T0WFhVwxZgyDgCKPpVsFIRrIzCHO0NqaekJC9HHc\ny/uffJI/XXFFSBVwBcEf4c4cJAkuzrCy5IYQXRz38unJky1PxhOEcJGZQxyh/YSzCvGH672UCDOh\nLhCfQwJRVFjIic8+A848WYpdOj5xLd1xX0UFatw4y+6lmB4FKxCzUgyitXe9Ha01C6dOpfHp06zG\nmiJxQnTQ2vqCf66I6VGwAlEOMYivH3dRYSGpJSXMwViOEsQuHa/URcE/Bw7FUx8q7wrRRZRDjOHv\nx71s4UJ+rRQKyE1K4o5u3djYuzdb16+PrsBCUGitmf/002zo1YvpAwY4X77upa8ZZG1YvT6FkMCE\nk0EXiRcJliFd3yuqJjrBVGENtmKrv+9JTU1NvVn9TzAPshJc/UH7sUUXFRZKqGM9wHF/za5eF6x5\nyJ+56unJk8UHIQSNhLLGCFpr/vDrXzPsnXfcqnoWpaby79xcWkuZjLjHV/FEfxFKrhVeXSu7BsJX\noUGb1ny2YwfLDxyQ8OcEQ6qy1hOKCgv5n5Ej+dm559KqTRvndlEC9QPtktfgmgX9zIcf8v+mTnUL\nO/XXNpSBPRiFJNQvpCprPcBzERibzeZz1Td/K8EJsY+/BZ6e/NOfvPwKrqvJOV4rUlP1yjfeCOr+\ni68qsSFMn4MkwcUAviJMtNaGnfjSS51Pes4QV5dtiYzW8ZPs5ason81mY8eiRfyzooKJs2ax+eOP\nmTxjBts2bGB3ZiYrXNeFyMzE9sILpKxbZ/r+BwqZle+PUBtiVooy2ocJYWKfPqAUc1zKZABSOsOD\nosJCVo0dy6CCArZ++mlcKAlXXE0+yxs14mWlGPvKKz4XcZrYty9ozZyPPzZ9/yO92JEQW8SUWQlo\nAbwFVAKlwAg/7R4DqoFyoML+N9NPW0unWuFitWnHl7nhyZQUvbxRI6fpoaiw0GeIayLjajK5qUsX\nPT6IkM9YwJfJZzzo8X36uC3iZAP9NOhlKSn6qZQUuf+CaYixUNbngJNAG2A08DelVFc/bf+ptc7Q\nWqfb/5ZZLEud4Ct7Wevgk5UcbNuwgQ9793YmQz3Wvz+fNWrEr376CTDCWVfOnElRHZZbiEccJhPA\nyByPs4xgXyafwUC7zZtZ9uKLzu/E2G7dKE1O5rmkJJKqq4Ez999ms4X8vROEWglHs7i+gFTgJyDb\nZdtLwJM+2j4GvGTyuFYp0rDx5TjW2nyykmPW4ZqU5DkTqW0mIcuBut+HlfaXw2kbL30yc8IE/Wj/\n/vrR/v31qPR0/SjoR+2zBNegBMd13q6Uc5YRyJktCA6IIYf0ecBprXWJy7YtQH8/7a9TSh0G9gP/\nq7X+u4Wy1Am+HMd5w4c7k5UmzZpF3vDhfm3Bzvr9p09zcP58Vl96qZfj2ZfjcuvOnfwXWNWsGUe+\n/pqWF1xAi9atSVu/PiEdi66zhlXAbPv2wVVVtd6DWMFh83ddAc5BkUtQguP7Nkwp7ujalU72MGdX\nZ3a8XLMQZ4SjWVxfQD9gn8e23wFrfLS9ADgLYzZ9ObAPuNnPca1WqCHhLyzQ1T4c6Gnedf9hTZvq\nGrt9ebyPmUht+yd6OKLjqfv2bt30sqSkuJ5ROa7lsQEDnK9H+/fXT48fHzAMNRQflIRCJxaEOXOw\nUjlcDFR6bJsELDWx74PAG34+s7rPQsKXuWdFkyb69nPPNRVH7rr/v0EX+XE8mzl/vA2AdYW/gXXm\nhAnRFi1s/OVFhFNrK9haTUJ8E65ysNKs9A3QQCmVrc+Yli4CtpvYVwN+58TTp093/p+Tk0NOTk7o\nUoaIL3PProMHGf7f/9YaR661vU6O3aE8BJgI6OpqptjbDApgEvHcP1DbRKI+h2P6+r5prUlbv97N\n3ATm8hec3yExQ9VbiouLKS4utux4luY5KKUWYwz0dwI9geXAFVrrrzzaDQU+0FofU0r1Ad4Epmit\nX/FxTG2ljFZiNo7cNZ7dwXLgS3AqB8BveQNf+ydyKQSt4yf5zWq01tzQpw89mjQhKSnJbXug/
AUp\no5F4xNoyoX8ECoCDwGHg91rrr5RS/YAVWusMe7tbgAKlVArwHfCUL8UQ65h9cnU8Bb5WUkKNvYBe\n5YkTHAO+atyY5PR0OnXp4nwy9PzRBnqKjOcfeKiDfCJniq9asoTOO3Zw5QsvmL52mXkKIRGOTSoS\nL2LE52Al4hg0CMUGnsiO+VCvPZD/IhH5tvRbPeq+UTpnTI4edd8o/W3pt9EWqU4gxpLgBBPIGr+h\nrVcAib3SWajX7plo6W/luUSgtKyUgfcOZFH6IoqzilmUvoiB9w6ktKw02qLFHuFolki8qGczh0R+\n8nUl1FDMRK0ymsjXbiWj7hulmYpmustrKnrUfaOiLZrlIDOH+CKRn3wdaO17xTtdy+whUJXR+o6v\na89LkGu3kr3leyHF/uYYUAxsgHc3viuzBw+kZHcEcQyKie4YDLWUdH11zJvB9dqPHDrEka+/pkVm\nJp0S4NqtJCM5wyj5WQV8BOQCKXCg+gAD7x3IO/PeISszK7pCxghSsjuCSEiqgZSSDh2ttZRuD4OB\nNw7k3W3vQhIwnDOzCIBqyFybSeZ5mXTM6Ej+pPy4VhSyTGgcIYOiEC6B8hW0Ttz8DzOUlpXS7Tfd\nOJl3EjZgzBo8eQ/oBXwGjSsbk9czj2enPRuXSkKUgwXIj0qIB1xnDb7WlnYsfjTYJQeitKyUabOn\nsbd8b714Gg6H0eNGsyh9kTFbKAauwGvmQDFGx9rNTVRD9pbsuDQ3hascxCGNhJYK8UEgX43Tn+US\nGixhm+64OaMvBtZiKATsf9cCNs4oBoy/JReVMG32tIjKGgskvEPa9UeViM7heCDRZ3aOp/9v3i0m\ns3kaa37egyaNmwC+6y05FMbL779FyUUlPge6V+bGXUGCsOmY0dFQAilAc6AvsA5aVbei8anG7E3Z\naziqP8RQHs3tO6bAvvJ90RE6iiS8cvAVWppIzuF4IJHLZTie/ksuKoGb4ZNqyN6S7mbmcJqbPKLg\nvju/MbTyOGCCDnQA+ZPy2XTvpjMKMxWyM7IpmFLArfm3Qm+cpiTWYiiP5sb7Dhkdoih5dEhos1Ko\n8fZC5PBlLkkkps2e5vfpHwzl8ashAxjw6Sde5qYmh2xnzCYOEnSgA8jKzOKdee8wqmIUuaW5jKoY\nxTvz3mH+6/PZ3Xu3Wx+TC2zG6XPIn5QfPcGjRELPHEKNtxciR6LP7PaW7/X79O+YVVQdLmHO2fCs\nDZocb8yF515I40aN6dU+m/9u2XdGuTgGunmJN9A5yMrM8jKp+evj5lXNGVIxhPx5ienET2jlkMhJ\nVfGAJA162Mkd2J/+XWcV+52fnaRVxbnOAfAOu79iX/k+OmR0SIiBLtgILX99PKTvkIT0zTiQUFYh\nZpGkQQ+fg0do5djpYynOKvbaJ7c0lzUL10Rc1lggUH/5UxCh7BMPSChrjKG1ZuaUKQlnG68LpJqo\nfzt5VmbWmSdeVxLYpwC1+2h8EaiPExmZOViMr0QkQagL6usTbzjk3p4rsyk7MnOIIRI9skaILI4n\n3qHfD6UcJ6llAAAcvElEQVTd6na0Xd2WHq17mN6/tKyU0eNGk3t7LqPHja4XyXEym7IOmTlYiKzT\nK0SaUGcP9XXWUV+vKxRk5hAjhJMzIX4KIVRCsbGHs1+sI/4D60joUFYrCbQYS22zh0TOABbCI1Ae\nRF3sFw/4ymUQgkdmDhaxdf16XmvVisf692dct26MTkpiRWZmrZE14qcQwiFUG7vY5oXaEOVgERf2\n60ezY8e4/L77SE5P52WbjQbp6dw/e3bA/WTZUN/UZmoTU5xB/qR8srdku1UXNVPuwbnfIYwy1e9B\n2r/TuOumu+pYYiFuCGcB6ki8DBFjG9fF328/91y9skkTbQN9d4MGeuUbb5jaTxaNd2flG2/oCenp\nuqiw0Oszm82m7x42TI9PS/P5eaLxbem3etR9o3TumFw96r5R+tvSb03t9/6693Va3zTNVDTT0UxF\nZw/JNr2/ENvYx86Qx16ZOViA69P/sJISOHGCVUDj06d54aGH/D7dBqrtlMjoWkxtRYWFqGXLGFxZ\nKaY4FzTu/VBbqOr81+dTeU1lvXNKC9YgyiFMHAOZI0ppqM1GEVAEzAFSS0pYtWSJz30lA/gM2sVM\nFMjUprXm9Ycf5jmbjVVA3tatCa1M/S3o88H6D2pd6Mdt8RsH9cQpLYSPRCuFia+n/xylKLA/zQ5T\niqULFzL4xhu99pV1o89QVFjIu7Nn8/NevVj9zDM8U1XFTOD+qir+5FJsr6iwkF+XlBiKA+DECYoS\nrBifK/5CUsdMHkNZblnAhX4CFfUTBJk5hInn0/9j/fvzUoMG1ACrMWYSjQ8fxmaziQPVD1prFk6d\nSrdTp5h7zz0M+PQTVmNUGn0H96UwX3/4Ya6z2QBDOST67MHf0/+xmmPGqmbFGAvXFANV7rOCUJ3Z\nQmIgM4cw8Xz6f/u11/jigw9YDkwC8jAGt6cnT+bg/PmSy+CDosJCUktKmAOMOXyYP7eAVsehqBp+\nmaz4effutF2/nu8PHuD6nTvdZmm5SUm8nplJtwQts+7v6T/NlsaxTcfgas6sbrYGMs7PcDZzJIwl\nWklvwRxSPsNirsnKYlxZGdcD/wIWn302F/zsZ3y2YwfLDxxgUt++zN64MSFNIL7QWnPLeefx2507\nGQysBB5oCo9Xww2noLAhLPjFVTz33IsMye1Jm5ofUUmAfWGbn3e5kDZXXJGwJjp/5SIym2byXpf3\nvJTG0O+HsnTB0miJK0QQKZ8RQ9TU1NBo926G2t9fD5w6epTL/vhHxlVUeDlYXZ2wiYpj1jDI/n4Q\nkHUchp8y3t9wCmyfbOGRZx7hq5E/8sEd8P5v4f07oOj3J9nX91xuHD+u3hWQM0tWZhYFUwrIXJtJ\n8xXNyVybScGUAmqa1LgrhmPAh/DBlx8kXB8JoSEzBwu567rruH75coa4bFsOzG3XjlUHDqAADc7Z\nw6olSxK+vPcfhgzhl0VFDLX7EYqA08C1Lm3+1SCZp/qdz8c5//Ha/7Jtl3Ho5KGELbTmb+bQvV13\nlp21zNh2DPgIY13kBOyjRCXcmYMoB4vQWpOblkaPqiqOAS0xHKpaKY4DK12uoSg1FV580Yjl/+ij\nhDY1zZo4kcrPP+frnV9zgINUHYL2Ck40gurmkHwsiV5de/PpTz+ydsgOLzNJ5tpM96gc+/ZRFaMS\nor7O6HGjWZS+yOv6r997PV8e/tJQGh8CV5CwfZSohKscxCFtEauWLOH+06d5D3gZwxk9FligFF0u\nuIDpbdo422qtOfzCCwz1iOVPxNmDw1fgfALOLIHtQA2kHUvj7cVv079ff79PyG06taEspcz9oAkU\nq++vgF455U5n89tVb3Ms5ZhXm0TpIyE0xOdgEds2bOD/mjYlByOKZgDwv6mp2Nq3p1NeHtOLi91e\nKT/8EFJ57/qKs9Ry0ihyO+cy6uJRbF22lf79+rt/
7lKKuWBKAd/v/j6hC8gFKqDnqE46pO+QhO4j\nX9THhY6sRsxKFqG1ZtLllzP7o4+8fAue5iLXRYGc22RxoKBwm2l8jk97OhhJYnvL99IxoyP5k+pf\nmKaZxW1kARx3EqU/YsrnoJRqARQAAzHqPU7VWr/qp+3TwB0Y42iB1vpBP+3iQjkEM+A77OyuSkNr\nTdollyRsSGawuNnajwGbgZ+g6Q9N6fHzHpyVdhZf7P+C3efudjdTPfe2czZSXygtK3XPVfChBM20\nSRT8+Wnqmw8m1pSDQxGMBS4B3gYu11p/5dHubmACRooOwLvAX7XW830cMy6Ugwz4kcVrIXnPiJz3\ngAvxmlWkvZfG1n9uTdiBUfDx3XFsL81lzcI1te7vULSxPiONGYe0UioVGA5001qfADYopZYBtwJT\nPZrfBjyjtd5v3/cZ4HeAl3KIF0QBRBavzODNnFECYHjTtntsS4HKayqZkD9BEsESmHBqSrmZpFoZ\n+226d1O9M0mBtQ7p84DTWusSl21bgO4+2na3f1ZbO0HwiVddoOMYIZuOOkI/ATX4rDu0+ovVCeGA\nFKerb/zVlLrrprtq7a/6uva2L6xUDmnAjx7bfgTSTbT90b5NEEzhGr102bbLaFDdwIjlz8X4q4F9\n+IzSOZl2sl7+mF3xV8pbFIT/yLexM8bW2l+JVObcSuVQCWR4bMsAKky0zbBvEwTTOEI1sztkc3rI\nabenOa6BRqoRScuT3J4QWQv0qp8/ZlcS6Qk3FBzfnTUL1/DK3FeY//p8U/2VSGtvW5kE9w3QQCmV\n7WJaugjD8uvJdvtnn9rfX+ynHQDTp093/p+Tk0NOTo4F4gr1BX+JYD179aRd43YsXbfUeAxSQF8g\nFTrU1L8fsyv++qS+K8VQMdtf+ZPy2XTvJq8w2Px50S9zXlxcTHFxsWXHs0w5aK2rlFJvAk8ope4E\negJDMSb5nrwETFJKrbS/nwT81d+xXZWDkFiYiQzx52DMbmusTfDlvV8G/DHHS/RJMMhCPsFhtr9i\nucy554Pz448/Htbx6jLP4TDwoNb6NaVUP2CF1jrDpe0M4E4M6/A/tNYP+TlmXISyCtZjNlmptnaB\nYvzra0JUfb2uuqI+9ldM5TnUBdFSDlprZj30EA889VRCFsSLBYJJVgo1yas+J0RJ4ltw1Lf+ipk8\nh/rGqiVL2P/cc7JyWxQJxm7ucDDW5TnijVD7JFGR/nJHCu/5QGttlNOuqEj4gnjRJBKRIYkUfSII\nwSDKwQerlixhsEc5bQdaa55+8EGeTvAV3CKBv2Sl/EnWRYZE4hyCEI+Iz8GD2qqrFhUW8sKtt9JW\nKa59+WUxOdUxkbAD1zdbsyCAOKQtJ1B11bzhw5l42WXw8cfMASb27cucBF3BTRDijfoYshwIcUhb\nzLYNG6js3ZuNntVV169Ha027zZu5BCOf6hebNyfsCm71jXgZOOJFzroklD5IpIJ5ViEzB5Nord1m\nDQ6Tk8we4p94iXGPFznrklD7oD6HLPsj3JmDOKRNsmrJEtpt3swvMRQDuM8ehPglXuoQucl5DPgQ\nSspLuHrk1QlTUC/Ue+WvYN67n72bMH0XLKIcTLJtwwY+adOGxc2acbv9NSYjgzdbt2br+vXRFk8I\ng3iptOmU07Gw0RXANVCWWxazFVeDLRteW3u3e3UMozz7Bnh3Y+BB3l/I8oHkAzHbd9FGzEpCwhOs\nySFadn+nnB9iKIYYN5EEawIy097ZB1W4r/wXwrFZi7MQY6z1nRWIWUkQwiSYXIdorpPglNPPIkax\nNtMJxgRUWlbK1SOvrrW9sw8+w2uVv0DmJUfBvLar2xpK4UMMxdCcmOy7WECUg5Dw+Fr8xd8TaDT9\nEw45M09nxkVWt1lznUPhljUoq7W9c5CvaRu0gszKzGLg5QPhSiAHQzFATPZdLCDKQRA4U1dnwfQF\nAIydPrZ2m7eDCD55ZmVmsWbxmrjI6jZbmsSpcJMx1T4rM4uBvQaGpCAlI948ohwEwY4Zk1Es1GIK\nZqYTTcwOxE6FezGGycfEwB3qIB8vfRcLiENaEOyYcUxLrkFwmClN4tbvx4DNQA1kns5kzeI1fvv1\ng/UfMGbyGI7VHKN5cnNenPki/fv1r/NrihekfIYgWETu7bkUZxV7by/NZc3CNc73UovJWkJRuGYX\neErkTHJRDoJgEYmYRRsrBKtwA92r/En5MrtDlIMgWEYocfmJ/nRqNWb7NNAsr0NGB1HyiHIQBEsx\n+wQrvgfrcPT5zn072f7ddiqvqTT69BCkFqfSpGkTkm3JXN79cuZMn0NWZlbAmcPe8r3eiuMYtFvf\njq7duyaMIhflIAgWEcxMQExQ1uCmZF0zvx0lQlwyoFkD5zQ5h/cL3gfwq5ynzZ7mfm+OAZuAq0ko\nRS4Z0oJgkkB1e4LNfK6rfIdgaxHFO25JhZozfboZQzFU4ayfREPYU7mHabOnBQxJ9Qpz/YwzigFi\ntrBirCHrOQgJQW31/ANlPvuaCTjzHTxmDuHkO8TamgOR8KnsLd9rXCsYZY4dfarxWT+JFVCyrwQ4\nk7joiUNxOMyD22u2czDloHsjKZlRKzJzEBKC2speBDsT8JWEdc7Gc6g4XhHyU38kS3PUNkOJVA0p\nt6RCRxLcIeAgsAbj8dWxKGMK8Cv4ft/3tR7XoTjWLFwTcjZ1oiPKQUgIahv8M5IzfA4g6cnpPo/n\nada4fu/1qIaKZWctC3kwjVRpDjMDf6QUlZuSbQ50AbVOwa+B4cBVGLOHY2fkOKvTWaaPX1pWSsXx\nChqvbAzv2Y8jJTNMIcpBSAhqK3uhapTxpOoyE2CNsd3fU7br02laRhq7e+8OejB1PXbZN2XGU7Mf\nGT3bh+qTMDPwR0pReSrZzJ2Z6Gu1m2zkYvggwBjY22a7HcNfnziU4LKzlnHy+pNwFTRZ34Tr915f\n753RViA+ByEhyJ+Uz6Z7N3lFt+TPM54ef+RHuAwjYkZj2L8vgwN7DpjyA7jZzh3UMph6+Rg6QoOi\nBpy+/DS08ZbRKp+EGVmD8amE65tw9R3k3p5LWUqZl2xovPrDcW5/feJLCZ7IO0FaRZooBhPIzEFI\nCGoruNYxoyOkYpRyzrX/TTXs22bMK6EU5PM1eJ0efJrMzzN9ymiVqceMrGYL21ntm/AnW7vj7XwW\nyQvUJ9GuoBvviHIQEgZXM9Arc19xG2T8DYZndTrL1AATSpVQf4NX1nlZPmW0arAzI6vZ6qX+BucJ\n+ROCkqk22Ta+sdGrPyBwn8RCBd14RpSDIOB/MMxum21qgAmlFLS/wSs9Od2nDd2qwc6srIGUqQN/\ng/PqL1aHNHsIth8D9Yms3RAekiEtCAEItWKoGRu8r2N3+rQT+pRmT7c9sB2ogbRjabz93Nucc/Y5\nMVeyw1+mOOtg1MV1ny1utjprIlbQlfIZglDHBDPAhFq8z3HsyvJKlqYthc9xS/5Key+Nrf/cChBT\ng11pWSn
dftONk3knzySqrQX6Qu5R91LndSlDLPVJrCDKQRBiCH9P0plrAy9c4yD39lyKdxWfqTHk\ncoxYrds07PZhLP12qWGkVhjJbKmB5ZWKtnVPuMpBQlkFwUL8hYmWNShj4L0Dzfkhaoh4lI1jsC45\nWML3u7+nXYd2dOnQxdSgPWf6HL6890u/YcK+zuUv/BQQpREjyMxBECzErw3eXnG0tqf/0rJSevyq\nB1W/rorYzMGXKYy1wCWQXWbOpxHItOM5S6gsr2Rpx6Ve1zf0+6FsP7A9pnwq8YyYlQQhhvhg/QcM\nuX/ImTUJXGzwNPdectST0rJSrrztSvaf2u9WYrrTp50o/r/iOhkkw1VogfCleBqvbszJfieNchku\ntFvdjgM5B+LGnBbrxEzJbqVUC6XUW0qpSqVUqVJqRIC2jymlqpVS5UqpCvvfTKtkEYRoUFpWytgZ\nY6nsXQlvYdTy+RCnYjATdjpt9jT2X7UfumEc403jdXLvScZOH1snZbz9haM6SmiHY87ylQdxMu+k\nUUbblWrQ1Tri5jTBP1b6HJ4DTmIk/l8CvK2U2qy1/spP+39qrW+z8PyCEFXcBsJBGAXjrsKnHd6f\nQ3Zv+V5IBr7CKD5n3/fgioMcbHEQUq0v4+2vVIajhHYweRSe11VysMSnD6ZxZWNOVp9065se3Xuw\ntNrb3CRJa9HBEuWglErFqKHYTWt9AtiglFoG3ApMteIcghDruDmjm2PMGD6E5lXNGdJ3CPnz8p2x\n9/4csh0zOhpP1Y4wVnCWquZDICfwOhOhkD8pn6W3LHU3hb0H2CBpWRIVfSooLSs15XfwvK604jTI\nxmvAz+uZR3pF+hkfhV1pBuPYFuoWq2YO5wGntdYlLtu2AP0D7HOdUuowsB/4X6313y2SRRCigtcT\neHPgChhSMcRtIA9UDyh/Uj5Lhi/hZMpJ94OnAEcw/BcKSlqd+alZUfiu+9nd+ejDjwz5D2OMDMPA\nlmJjWfUytt+7vdbZiq/rquxfSdp7aW6KJ3tLNs/Oe9bnsVwX6XEoDXFGRwerlEMa8KPHth8B38Xw\n4TXg/4ADGLUwlyiljmqtX7NIHkGIOLVVfnUQqCpqVmYWeT3zWFa9zNvM0wxjRlENX773pdP3YEWl\n1i4duvDR+R8Z5yzGPc8ixdxsxed1tYEeZ/cguyLb1IDvb3U3IfKYUg5KqbXAAAwXlScbgHEYX11X\nMoAKX8fTWn/t8najUuqvwI0YSsOL6dOnO//PyckhJyfHjNiCEFE8l6f0NxDWVg772WnPsv1e95BO\n1mA8RmHsV3lNpbMaazDLm/rDTbG5ruXswIRj2N91ZXfIlgE/AhQXF1NcXGzZ8SwJZbX7HI4A3R2m\nJaXUi8BerXWtPgel1GSgj9b6Rh+fSSirUK8wU2LDNW9g69at/DDgB6/Qz9zSXDSa4qxir3PUFjLr\nKovDJNWMZuhkzUdbP+JAXvAhpaHUoRLqjpgIZdVaV2EE3T2hlEpVSl0JDAVe9tVeKTVUKdXc/n8f\njJnHv6yQRRCiTW2rtZmpPOpaEXVwv8HGWhOu2Gca4VRq9VyLYWnHpWw/sJ3X57weUjXTUCrTCrGL\nZUlwSqkWQAEwEMOl9aDDh6CU6ges0Fpn2N8vBvIwni++w3BI/6+f48rMQYgb6uLpOdAxgZDP5y/5\nbVTFKPIn5Ue8mJ3UW7IWyZAWhBgi0IAbjt3dTHmKYAfy3NtzLTNJhTuYi0nKeqTwniBEEc8Bcue+\nnfBzj0YWZPkGiuIJNcInmHWiPbFqPWsHgcJ7xZkdHUQ5CEKI+Ez6+i4NzsKoE+AgRrJ8PRXZXTfd\nxaYZtYfe+sLqwTxQeK8QHUQ5CEKI+Ez6uqaStH+nUXldZdADbl3i80l/xiYKphQw//X5QSedWT2Y\nhzOLEeoGUQ6CECL+Bsge3cwnfVlJIB+Avyf9MZPHmFqEyBOrB3OzCYRC5BDlIAgh4jfpq23kk75q\n8wGEuwiRJ1YP5mYTCIXIIdFKghAisRRhU1uUVF2s2SBrN8c2Eq0kCFEiKzOLgikFjJk8hmM1x2ie\n3JyCmQVRGSBr8wH4etJnBdAI+NC9kJ9ZQo2SknyG+ECUgyCEiGNxn7LcMkiBY9XHGDtjbFRmDrX5\nABxmm6tHXk0ZZfADRrpqG9wK+UUi0c3KEFih7hCzkiCESF0lvIWCWRNXaVkpFw698Ew0VYTljqU+\nq+/ERG0lQUhE/C2vGY3YfLN1jbIys+jRrUfU5I6lPhMCI2YlQQiRWIvNN+sDyG6bzabqTVGRO9b6\nTPCPmJUEIURiKVopGKIpd7z2WTwihfcEIYrEazhnNOWO1z6LN0Q5CIIgCF6IQ1oQhDqltsWLhPqJ\nzBwEQfCL+AjiF5k5CIJQZwQqzS3Ub0Q5CILgF8lLSFxEOQiC4BdnXoIrkpeQEIjPQRAEv4jPIX6R\nUFZBEOoUyUuIT0Q5CIIgCF5ItJIgCIJgOaIcBEEQBC9EOQiCIAheiHIQBEEQvBDlIAiCIHghykEQ\nBEHwQpSDIAiC4IUoB0EQBMELUQ6CIAiCF6IcBEEQBC9EOQiCIAheiHIQBEEQvLBEOSil/qiU+kQp\ndVIpVWCi/USl1H6l1FGl1PNKqYZWyCEIgiBYg1Uzh71APrCgtoZKqUHAZCAXyASygcctkkMQBEGw\nAEuUg9b6X1rrZcARE81vAxZorb/WWv+IoVR+a4Uc0aS4uDjaIphC5LSOeJARRE6riRc5wyUaPofu\nwBaX91uAtkqpFlGQxTLi5QsjclpHPMgIIqfVxIuc4RIN5ZAG/Ojy/kdAAelRkEUQBEHwQa3KQSm1\nVillU0rV+Hh9EMI5K4EMl/cZgAYqQjiWIAiCUAdYukyoUiof6Ki1HhugzSLgW631NPv7q4FXtNYd\n/LSXNUIFQRBCIJxlQhtYIYBSKhloCCQDDZRSjYDTWusaH81fAl5QSi0GvgceBl7wd+xwLk4QBEEI\nDat8Do8AVcCDwCj7/w8DKKXOUUqVK6XOBtBarwJmAmuBUvtrukVyCIIgCBZgqVlJEARBqB9I+QxB\nEATBi5hTDsGU4lBKjVFKnbabrSrsf/vHmpz29lEpGaKUaqGUekspVamUKlVKjQjQ9jGlVLVHf2bG\ngFxPK6UOK6UOKaWergt5wpUzkn3n49zB/GaiVrrGrJxR/l2n2PulTCn1o1LqM6XU4ADto/W7Ni1n\nqP0Zc8qBIEpx2PlQa52htU63/w0lvDYU4qVkyHPASaANMBr4m1Kqa4D2//Toz7JoyqWUuhsYCvwc\nuBC4Vil1Vx3JFLKcdiLVd56Y+i7GQOmaYH7b0fpdNwB2A1dprZsBjwKvK6U6eTaMcn+altNO0P0Z\nc8ohyFIcUSMeSoYopVKB4cAjWusTWusNwDLg1ro+t4Vy3QY8o7Xer7XeDzwD3B6DckaNIL6LUS1d\nEw+/ba11ldb6Ca31Hvv7tzGCZnr5aB61/gxSzpCIOeUQAj2VUgeV
Ul8rpR5RSsXiNUWrZMh5GCHF\nJR7n7h5gn+vsJpxtSqnfx4BcvvoukPxWEmz/RaLvwiGeStfExO9aKdUOOBfY7uPjmOnPWuSEEPrT\nkjyHKPI+0ENrvUsp1R14HTgFRNQubYJAJUOORvC8jnP7K1XyGvB/wAHgMmCJUuqo1vq1KMrlq+/S\nLJbHH8HIGam+C4dofQ+DJSZ+10qpBsArwEKt9Tc+msREf5qQM6T+jKg2VhaX4tBal2mtd9n/3w48\nAdwYa3JSRyVDTMhZCTTz2C3D33nt0+PvtcFG4K9Y0J8+8OyPQHL56rvKOpDJF6bljGDfhUNclK6p\nq991MCilFMaA+xNwn59mUe9PM3KG2p8RVQ5a61ytdZLWOtnHy6pohLAzqutAzu3ARS7vLwYOaK3D\nerowIec3QLJSKttlt4vwP/X0OgUW9KcPvsHIpDcjl6++Myt/uAQjpyd11XfhUCffwwgR6b5cALQG\nhvup9ACx0Z9m5PRFrf0Zc/Z5pVSyUqoxLqU4lFGew1fbwUqptvb/L8DI1P5XrMmJUTLkDqVUV7s9\nMmDJEKvQWlcBbwJPKKVSlVJXYkT+vOyrvVJqqFKquf3/PsA46qA/g5TrJWCSUqqDUqoDMIkI9F2w\nckaq73wRxHcxKt/DYOWM5u/afs6/AxcAQ7XW1QGaRrs/TckZcn9qrWPqBTwG2IAal9ej9s/OAcqB\ns+3vZ2HUZ6oAdtr3TY41Oe3bJthlPQY8DzSMkJwtgLcwpsBlwM0un/UDyl3eLwYO22X/D/DHSMvl\nKZN92wzgB7tsT0X4+2hKzkj2ndnvov17WBEL38Ng5Izy77qTXcYq+/kr7Pd0RIz9rmuTM+z+lPIZ\ngiAIghcxZ1YSBEEQoo8oB0EQBMELUQ6CIAiCF6IcBEEQBC9EOQiCIAheiHIQBEEQvBDlIAiCIHgh\nykEQBEHwQpSDIAiC4MX/B4BXNRO7H//+AAAAAElFTkSuQmCC\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y_pred_idx = y_pred.reshape(-1) # a 1D array rather than a column vector\n", "plt.plot(X_test[y_pred_idx, 1], X_test[y_pred_idx, 2], 'go', label=\"Positive\")\n", "plt.plot(X_test[~y_pred_idx, 1], X_test[~y_pred_idx, 2], 'r^', label=\"Negative\")\n", "plt.legend()\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Well, that looks pretty bad, doesn't it? But let's not forget that the Logistic Regression model has a linear decision boundary, so this is actually close to the best we can do with this model (unless we add more features, as we will show in a second)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's start over, but this time we will add all the bells and whistles, as listed in the exercise:\n", "* Define the graph within a `logistic_regression()` function that can be reused easily.\n", "* Save checkpoints using a `Saver` at regular intervals during training, and save the final model at the end of training.\n", "* Restore the last checkpoint upon startup if training was interrupted.\n", "* Define the graph using nice scopes so the graph looks good in TensorBoard.\n", "* Add summaries to visualize the learning curves in TensorBoard.\n", "* Try tweaking some hyperparameters such as the learning rate or the mini-batch size and look at the shape of the learning curve." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Before we start, we will add 4 more features to the inputs: ${x_1}^2$, ${x_2}^2$, ${x_1}^3$ and ${x_2}^3$. This was not part of the exercise, but it will demonstrate how adding features can improve the model. We will do this manually, but you could also add them using `sklearn.preprocessing.PolynomialFeatures`." 
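] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For reference, here is a rough sketch of that Scikit-Learn alternative (note that `PolynomialFeatures` also adds interaction terms such as $x_1 x_2$, which the manual version does not, so we will not use it in the rest of this notebook):" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Sketch only (not used below): degree-3 features via Scikit-Learn,\n", "# computed on the two original features (we skip the bias column).\n", "from sklearn.preprocessing import PolynomialFeatures\n", "\n", "poly = PolynomialFeatures(degree=3, include_bias=False)\n", "X_train_poly = poly.fit_transform(X_train[:, 1:])\n", "X_train_poly.shape  # (n_train, 9): x1, x2, x1^2, x1*x2, x2^2, x1^3, x1^2*x2, x1*x2^2, x2^3" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's build the \"enhanced\" training set manually:"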
] }, { "cell_type": "code", "execution_count": 132, "metadata": { "collapsed": true }, "outputs": [], "source": [ "X_train_enhanced = np.c_[X_train,\n", " np.square(X_train[:, 1]),\n", " np.square(X_train[:, 2]),\n", " X_train[:, 1] ** 3,\n", " X_train[:, 2] ** 3]\n", "X_test_enhanced = np.c_[X_test,\n", " np.square(X_test[:, 1]),\n", " np.square(X_test[:, 2]),\n", " X_test[:, 1] ** 3,\n", " X_test[:, 2] ** 3]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is what the \"enhanced\" training set looks like:" ] }, { "cell_type": "code", "execution_count": 133, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[ 1.00000000e+00, -5.14696757e-02, 4.44198631e-01,\n", " 2.64912752e-03, 1.97312424e-01, -1.36349734e-04,\n", " 8.76459084e-02],\n", " [ 1.00000000e+00, 1.03201691e+00, -4.19741157e-01,\n", " 1.06505890e+00, 1.76182639e-01, 1.09915879e+00,\n", " -7.39511049e-02],\n", " [ 1.00000000e+00, 8.67891864e-01, -2.54827114e-01,\n", " 7.53236288e-01, 6.49368582e-02, 6.53727646e-01,\n", " -1.65476722e-02],\n", " [ 1.00000000e+00, 2.88850997e-01, -4.48668621e-01,\n", " 8.34348982e-02, 2.01303531e-01, 2.41002535e-02,\n", " -9.03185778e-02],\n", " [ 1.00000000e+00, -8.33439108e-01, 5.35056649e-01,\n", " 6.94620746e-01, 2.86285618e-01, -5.78924095e-01,\n", " 1.53179024e-01]])" ] }, "execution_count": 133, "metadata": {}, "output_type": "execute_result" } ], "source": [ "X_train_enhanced[:5]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Ok, next let's reset the default graph:" ] }, { "cell_type": "code", "execution_count": 134, "metadata": { "collapsed": true }, "outputs": [], "source": [ "reset_graph()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's define the `logistic_regression()` function to create the graph. We will leave out the definition of the inputs `X` and the targets `y`. We could include them here, but leaving them out will make it easier to use this function in a wide range of use cases (e.g. perhaps we will want to add some preprocessing steps for the inputs before we feed them to the Logistic Regression model)." 
] }, { "cell_type": "code", "execution_count": 135, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def logistic_regression(X, y, initializer=None, seed=42, learning_rate=0.01):\n", " n_inputs_including_bias = int(X.get_shape()[1])\n", " with tf.name_scope(\"logistic_regression\"):\n", " with tf.name_scope(\"model\"):\n", " if initializer is None:\n", " initializer = tf.random_uniform([n_inputs_including_bias, 1], -1.0, 1.0, seed=seed)\n", " theta = tf.Variable(initializer, name=\"theta\")\n", " logits = tf.matmul(X, theta, name=\"logits\")\n", " y_proba = tf.sigmoid(logits)\n", " with tf.name_scope(\"train\"):\n", " loss = tf.losses.log_loss(y, y_proba, scope=\"loss\")\n", " optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)\n", " training_op = optimizer.minimize(loss)\n", " loss_summary = tf.summary.scalar('log_loss', loss)\n", " with tf.name_scope(\"init\"):\n", " init = tf.global_variables_initializer()\n", " with tf.name_scope(\"save\"):\n", " saver = tf.train.Saver()\n", " return y_proba, loss, training_op, loss_summary, init, saver" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's create a little function to get the name of the log directory to save the summaries for Tensorboard:" ] }, { "cell_type": "code", "execution_count": 136, "metadata": { "collapsed": true }, "outputs": [], "source": [ "from datetime import datetime\n", "\n", "def log_dir(prefix=\"\"):\n", " now = datetime.utcnow().strftime(\"%Y%m%d%H%M%S\")\n", " root_logdir = \"tf_logs\"\n", " if prefix:\n", " prefix += \"-\"\n", " name = prefix + \"run-\" + now\n", " return \"{}/{}/\".format(root_logdir, name)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Next, let's create the graph, using the `logistic_regression()` function. We will also create the `FileWriter` to save the summaries to the log directory for Tensorboard:" ] }, { "cell_type": "code", "execution_count": 137, "metadata": {}, "outputs": [], "source": [ "n_inputs = 2 + 4\n", "logdir = log_dir(\"logreg\")\n", "\n", "X = tf.placeholder(tf.float32, shape=(None, n_inputs + 1), name=\"X\")\n", "y = tf.placeholder(tf.float32, shape=(None, 1), name=\"y\")\n", "\n", "y_proba, loss, training_op, loss_summary, init, saver = logistic_regression(X, y)\n", "\n", "file_writer = tf.summary.FileWriter(logdir, tf.get_default_graph())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "At last we can train the model! We will start by checking whether a previous training session was interrupted, and if so we will load the checkpoint and continue training from the epoch number we saved. In this example we just save the epoch number to a separate file, but in chapter 11 we will see how to store the training step directly as part of the model, using a non-trainable variable called `global_step` that we pass to the optimizer's `minimize()` method.\n", "\n", "You can try interrupting training to verify that it does indeed restore the last checkpoint when you start it again." 
] }, { "cell_type": "code", "execution_count": 138, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch: 0 \tLoss: 0.629985\n", "Epoch: 500 \tLoss: 0.161224\n", "Epoch: 1000 \tLoss: 0.119032\n", "Epoch: 1500 \tLoss: 0.0973292\n", "Epoch: 2000 \tLoss: 0.0836979\n", "Epoch: 2500 \tLoss: 0.0743758\n", "Epoch: 3000 \tLoss: 0.0675021\n", "Epoch: 3500 \tLoss: 0.0622069\n", "Epoch: 4000 \tLoss: 0.0580268\n", "Epoch: 4500 \tLoss: 0.054563\n", "Epoch: 5000 \tLoss: 0.0517083\n", "Epoch: 5500 \tLoss: 0.0492377\n", "Epoch: 6000 \tLoss: 0.0471673\n", "Epoch: 6500 \tLoss: 0.0453766\n", "Epoch: 7000 \tLoss: 0.0438187\n", "Epoch: 7500 \tLoss: 0.0423742\n", "Epoch: 8000 \tLoss: 0.0410892\n", "Epoch: 8500 \tLoss: 0.0399709\n", "Epoch: 9000 \tLoss: 0.0389202\n", "Epoch: 9500 \tLoss: 0.0380107\n", "Epoch: 10000 \tLoss: 0.0371557\n" ] } ], "source": [ "n_epochs = 10001\n", "batch_size = 50\n", "n_batches = int(np.ceil(m / batch_size))\n", "\n", "checkpoint_path = \"/tmp/my_logreg_model.ckpt\"\n", "checkpoint_epoch_path = checkpoint_path + \".epoch\"\n", "final_model_path = \"./my_logreg_model\"\n", "\n", "with tf.Session() as sess:\n", " if os.path.isfile(checkpoint_epoch_path):\n", " # if the checkpoint file exists, restore the model and load the epoch number\n", " with open(checkpoint_epoch_path, \"rb\") as f:\n", " start_epoch = int(f.read())\n", " print(\"Training was interrupted. Continuing at epoch\", start_epoch)\n", " saver.restore(sess, checkpoint_path)\n", " else:\n", " start_epoch = 0\n", " sess.run(init)\n", "\n", " for epoch in range(start_epoch, n_epochs):\n", " for batch_index in range(n_batches):\n", " X_batch, y_batch = random_batch(X_train_enhanced, y_train, batch_size)\n", " sess.run(training_op, feed_dict={X: X_batch, y: y_batch})\n", " loss_val, summary_str = sess.run([loss, loss_summary], feed_dict={X: X_test_enhanced, y: y_test})\n", " file_writer.add_summary(summary_str, epoch)\n", " if epoch % 500 == 0:\n", " print(\"Epoch:\", epoch, \"\\tLoss:\", loss_val)\n", " saver.save(sess, checkpoint_path)\n", " with open(checkpoint_epoch_path, \"wb\") as f:\n", " f.write(b\"%d\" % (epoch + 1))\n", "\n", " saver.save(sess, final_model_path)\n", " y_proba_val = y_proba.eval(feed_dict={X: X_test_enhanced, y: y_test})\n", " os.remove(checkpoint_epoch_path)\n", "\n", "file_writer.close()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Once again, we can make predictions by just classifying as positive all the instances whose estimated probability is greater or equal to 0.5:" ] }, { "cell_type": "code", "execution_count": 139, "metadata": { "collapsed": true }, "outputs": [], "source": [ "y_pred = (y_proba_val >= 0.5)" ] }, { "cell_type": "code", "execution_count": 140, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "0.97979797979797978" ] }, "execution_count": 140, "metadata": {}, "output_type": "execute_result" } ], "source": [ "precision_score(y_test, y_pred)" ] }, { "cell_type": "code", "execution_count": 141, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "0.97979797979797978" ] }, "execution_count": 141, "metadata": {}, "output_type": "execute_result" } ], "source": [ "recall_score(y_test, y_pred)" ] }, { "cell_type": "code", "execution_count": 142, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYcAAAEFCAYAAAAIZiutAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztnXl8VNW9wL8n7DEJhEUEFJIGF5CnRVlcEBJ9BFtcEBVl\nUXxY7dMiWxUVi6JplaWC5flsSwWpIm5BhSqLC8SyWquylKdYYwIIyCJgEgEjmfP+uDOT2e7MnZk7\nW+b3/XzuJ5k7Z+793TN3zu+e33aU1hpBEARB8CQj0QIIgiAIyYcoB0EQBMEPUQ6CIAiCH6IcBEEQ\nBD9EOQiCIAh+iHIQBEEQ/BDlIAiCIPhhq3JQSv1KKfWRUuqEUmpBkHajlVInlVJVSqlq59/+dsoi\nCIIgRE5jm4+3BygBBgEtQrTdoLUWhSAIgpCE2KoctNZvAiilegOd7Dy2IAiCED8S6XPoqZQ6oJT6\nXCn1G6WU+D8EQRCSBLvNSlb5AOihtd6plDoXeBX4EZiRIHkEQRAEDxLytK61rtRa73T+vx14DLgh\nEbIIgiAI/iRq5hAIFXCnUlI2VhAEIQK01gHHVSvYHcraSCnVHGgENFZKNVNKNQrQ7kql1KnO/88B\nfgO8aXZcrXXSb4888kjCZRA5RUaRU+R0bdFit1npN8Ax4H5gpPP/h5RSZzjzGU53trsC2KqUqgbe\nAkqBJ2yWRRAEQYgQu0NZHwUeNXk726PdfcB9dp5bEARBsA8JH7WJwsLCRItgCZHTPlJBRhA57SZV\n5IwWZYdtKpYopXSyyygIgpBsKKXQUTikkylaSRCEFCMvL4+dO3cmWoy0pkuXLlRWVtp+XJk5CIIQ\nMc6n00SLkdaYfQfRzhzE5yAIgiD4IcpBEARB8EOUgyAIguCHKAdBEIQQ7N69m5ycnKD+lezs7Jg4\nhhOFKAdBEBokeXl5ZGZmkpOTQ4cOHRgzZgzHjh2L6FhnnHEGVVVVKGX4d4uKiliwwHuxy+rqavLy\n8qIVO2kQ5SAIgu1UVFYwatwoim4rYtS4UVRUVsT9GEop3n77baqqqvjkk0/46KOP+O1vfxu2HOmK\nKAdBEGylorKCgWMH8mL2i5Tll/Fi9osMHDswrMHdjmMAbjNQhw4d+NnPfsa//vUv9u3bxzXXXEOb\nNm0466yzePbZZ93tP/roI3r37k3Lli3p0KED9957LwA7d+4kIyMDh8PBb37zG9auXcvYsWPJyclh\n3LhxAGRkZPDVV1/x4Ycf0qFDBy8T1BtvvMH555/vlmn69Ol07dqVdu3acfPNN3P06NGwriseiHIQ\nBMFWps6eSvn55dDUuaMplJ9fztTZU+N6DE92797N8uXL6dmzJ8OHD6dz58588803vPbaa0yZMoU1\na9YAMH78eCZMmMB3331HeXk5w4YNcx/DZVL67W9/y2WXXcbTTz9NVVUVc+fO9Xq/b9++ZGVlsXr1\navdnX3rpJUaNGgXAH/7wB5YtW8batWvZu3cvubm53H333RFdVywR5SAIgq3sqdpTP6i7aAp7q/bG\n9RgAQ4YMoXXr1vTv35+ioiLuuOMO1q9fz8yZM2nSpAnnn38+v/jFL3jhhRcAaNKkCV9++SXffvst\nmZmZ9OnTx/K5PGcKN998M4sXLwYMX8Ty5csZPnw4APPmzeN3v/sdHTp0oEmTJjz88MOUlpbicDjC\nurZYI8pBEARb6ZTTCWp9dtZCx5yOcT0GwNKlSzl8+DAVFRX8z//8D3v37qV169ZkZma623Tp0oU9\ne/YAsGDBAnbs2ME555xD3759efvtt8M6n4sRI0bwxhtv8OOPP/L6669z4YUXcvrpxooFO3fu5Lrr\nrqN169a0bt2a7t2706RJE/bv3x/RuWKFKAdBEGylZFIJBVsK6gf3WijYUkDJpJK4HgPwCz3t2LEj\nhw8f5vvvv3fv27VrF506dQKgoKCAxYsXc/DgQSZPnswNN9zA8ePH/Y7rMiGZ0a1bN7p06cLy5ct5\n6aWXGDFihPu9zp07s2LFCg4fPszhw4c5cuQI33//PR06dAjr2mKNKAdBEGwlPy+fd59+l5HVIymq\nKGJk9Ujeffpd8vPy43qMQJx++ulccsklPPjgg/zwww9s3bqV+fPnu/0BL774IocOHQKgZcuWKKVo\n1MhYzNJT0bRv356vvvoq6LlGjBjB3LlzWbt2LTfeeKN7/y9/+UumTJnCrl27ADh48CDLli2L6rpi\nQqKXsrOw1J0WBCE5SebfZ35+vn7//ff99u/Zs0dfddVVunXr1rpr16563rx57vdGjRqlTz31VJ2d\nna179Oihly1bprXWurKyUmdkZOi6ujqttdYbN27UZ511lm7durUeP3681lrrjIwMXV5e7j7Wrl27\ndKNGjfTVV1/tdX6Hw6HnzJmjzz77bJ2Tk6O7du2qH3rooYiv0+w7cO6PeOyVqqyCIESMVGVNPFKV\nVRAEQYgbohwEQRAEP0Q5CIIgCH6IchAEQRD8EOUgCIIg+CHKQRAEQfBDlIMgCILghygHQRAEwQ9R\nDoIgCDHm5z//ubvya6ogykFIOrTWzHzgAcm8FaIiLy+P0047zatw3vz58ykqKorpeR999FFuvfVW\nr33Lly/nlltuiel57UaUg5B0rFqyhH3PPMM7r7+eaFGEKLBDyUdzDKUUdXV1PPXUU377hdCIchCS\nCq01q37/e2ZXV7Ny1qyQg4LMMpIXO5R8tMe47777ePLJJ6mqqvJ77/PPP6e4uJg2bdrQrVs3Xnvt\nNfd7hw8f5uqrr6Zly5b07duXqVOnctlll7nfnzBhAp07d6Zly5b07t2bdevWGfKuWsXjjz/OK6+8\nQnZ2Nj179gSgqKiIBQsWUFtbS25uLv/3f//nPtahQ4fIzMx0V4N966236NmzJ7m5ufTr149t27ZF\ndO3RIspBSCpWLVnCldu2oYDibdu4a+jQoAO/zDKSk3CVfKyO0atXLwoLC5k1a5bX/mPHjlFcXMyo\nUaM4dOgQL730EnfffTefffYZAHfffTfZ2dkcOHCAhQsX8te//tVrxtGnTx+2bt3KkSNHGDFiBDfe\neCO1tbUMGjSIKVOmcNNNN1FdXc2nn37qdd6mTZty/fXX89JLL7n3vfrqqxQWFtK2bVs++eQTbr/9\ndv7yl79w+PBhfvnLX3LNNdfw448/hn3t0SLKQUgaXINB8bFjAFx57Bj7ly1j1ZIlQdtHM3gIscFT\nyQ/ati0i5W3HMcDwATz99NN8++237n1vvfUW+fn53HrrrSil+OlPf8r111/vXq7z9ddf57HHHqNZ\ns2Z069aN0aNHex1zxIgRtGrVioyMDCZOnMgPP/zAjh07LMkzfPhw9xKiAIsXL2bkyJEAPPvss/z3\nf/83vXr1QinFLbfcQrNmzdi0aVNE1x4NohyEpMFzMABQwJ0OB889+CAOh4MZDzzAjPvvdysBuwYP\nwV58lfygY8fCVt52HMPFueeey1VXXcUTTzzh3rdz5042
bdrkXqozNzeXxYsXs3//fg4ePMjJkyfd\ny3oCnHHGGV7HfPLJJ+nevTu5ubnk5uZSVVXlNguF4vLLL+fEiRN89NFH7Nq1iy1btjBkyBC3XE8+\n+aSXXF9//TV794a3drYdiHIQkoZt69ezoVcvpg0YwLju3RmnFJuAtpWVzJg8ma/nzuWTuXN55/XX\nLQ8e4pOIP4GUfLjK245jeDJt2jT+8pe/uNeK7ty5M4WFhV5LdVZVVfH000/Trl07mjRpwtdff+3+\n/O7du93/r127lpkzZ1JaWsqRI0c4cuQIOTk57nsslMNbKcWwYcNYvHgxixcv5qqrruKUU04BDCX0\n0EMPeclVU1PDTTfdFNF1R0PjuJ9REEy4b84cwBjQJ118MbO1RgH65EmG/ulPLDl+nEnAipkzcTgc\npoPHoOuvdx/T7ZPo3dtrvxA7tq1fT02vXmz0GCS11mStW2f5O7DjGJ4UFBRw0003MXfuXM477zwG\nDx7M/fffz6JFi7j55pvRWrNlyxays7M5++yzGTp0qFuh7Ny5k+eff54uXboAUFNTQ5MmTWjTpg21\ntbVMnz6d6upq97nat2/Pe++9Z6ymZqIohg8fzpAhQ2jbti2/+93v3PvvuOMOhg4dyhVXXEGfPn34\n/vvv+eCDDxgwYIBbgcSNaJaR892AXwEfASeABSHaTgT2AUeAZ4EmJu1Ml8cTGiYrXntNr8zM1Brc\n299ArwS9AvTjTZvquwcP1g/3768fGTDAvT3cv7+eOWGC+zgOh0OP79NHTwc9vk8f7XA4EnhVDZNk\n/n36LhO6e/du3aJFC3355ZdrrbX+4osv9ODBg3W7du1027Zt9RVXXKG3bNmitdb64MGDevDgwbpl\ny5a6T58++oEHHtD/+Z//qbXWuq6uTt9+++06JydHd+zYUc+aNcvrXN9++63u16+fzs3N1RdeeKHW\nWuuioiI9f/58L/m6du2q27Ztq3/88Uev/atWrdK9e/fWubm5umPHjnrYsGG6pqbG9DrNvgOSaZlQ\npdQQwAEMAlporceYtBsELASKnAriTWCj1npKgLbaThkbIlprZj34IPc98UTKx3Brrbm+Tx96tGhB\nRkYGWmvKP/2Un1RXkw3ci/FUQZ8+zNm0Kej1riwt5dORIzlQW8upTZtyweLFMnuwmXRZJvSBBx5g\n//79PPfcc4kWxY+UWCZUa/2m1noZcDhE01uB+Vrrz7XW3wElwH/ZKUs60ZDCOVctWUKXHTu4dPx4\nppWVcfE99zC8tpbHgPswzEdXAu03bw56vVprVs6axf7aWmYD+2trWTFzZloMZEL07Nixw51f8I9/\n/IP58+czdOjQBEsVXxLlkD4X2OLxegtwqlIqN0HypCy6AYVzBrqWbevXs6RtW0bn5HBby5bc1rIl\nL7VsyUft2rHVmXgUiFVLltB+82Z+hnWFIgguqqurGTp0KFlZWdx8883cd999XH311YkWK75EY5My\n2zBmAqY+B+BLoNjjdWMMc1TnAG1NbW2Ct31+RWamXllaGrCdw+HQM+6/Py5290jO5XA49C+vvVav\nsHAtVpgxfry+PjtbO5w+Cwfo67Oz9ZALLxTfg43I7zPxmH0HROlzSNTMoQbI8XidA2igOlDjadOm\nubeysrI4iJcaaG09FtzT9KR1bMM7IzFzrSwtZf/f/sYgG+LaAc7r14876uq8opluq62l6fbtMnsQ\nGiRlZWVeY2W02OqQdh9UqRKgkzZ3SL8IfKW1nup8fTmwSGvdMUBbHatBLNVZWVqKGj3aPaACrMzM\nRD3/vJfjVWtnaOiHHzKpb1+Kf/1r3rn9dq587jnbHbS+55q9cWNIJ7nWmpvPOov/+vJLrvS8vgDX\nYpVZEydS88kn7nNrrfny0085v7qafRblEkKTLg7pZCZWDmm7o5UaAU2Ah4HTgTuAk1rrOp92g4Dn\ngCuAb4BSYJPW+qEAxxTlYILvAAjOWPALLnDnDIC3ElnRogWvnn46C/79b8uDdzh4nsvq4L6ytJS/\nDR9O25MnOaIUdOtG63btAl5LPOUSQiPKIfHESjnY7Wt4BMN3UOexPQycgWEyOt2j7QQMxXAUyXOI\nGQ6HQ0/o29dte18OemlGRlC7fqT+Cd9zOcB4HeQ4dXV1emCHDrouwGfs8pNEIlegY8TLZ5NKdOnS\nRWOYhGVL0NalS5eA3w1R+hxi4pC2cxPlEB2eDmsH6AnOv8EGyRWvvaYnZGeH7RAOlLwWyrH8+KRJ\neixGgpvvZyKVI5BcK8KUK9Ax7JBFEOJFtMpBymc0cDzLEOw8eJChn3+OcjiAwCUntK4PJ500axbF\nQ4daNjv5ljzQWvNJRQWXrV0b0ISjtWbz4sW8DAzLzmZDz57uKfIpa9eyb9OmiOQIJNeuvDwWff45\nrc85p95kZbEUg8Ph4Llx43jZBlkEIWWIRrPEY0NmDrYxc8KEkCUnrIbGWiHU03awc9kph6dZyXOm\nZNVU9PikSXqpyYzDdYy6ujoxOwlJBWJWEuwikG1+fN++evrkyVH5HwKZroL5AezwEXhipmismIrq\n6ur0kFNOMZXFdYzHf/1rMTsJSYUoB8GLaBynAQveNWumhzVvHpX/IdCTfzD/RCS+CzPMFE1dXV1Q\n5eXi8UmT9FsecnjK4nnsIaecouuiVGKCYCfRKoeY5DnYiYSyhsfK0lJWjRnDoAUL2PrPf4ZVjM+u\n3ACt63MdFEZIhW/YbLAwXMBSiK4VzHJBPr3rLi744x+DhrZqrbmqQwcu3L+fDOd1lGdn85OePcm+\n4AL+49JL3cd+CyOGW0uYrJAkJFUoayw2ZOZgGc8n2WFdu+rxUZo5IrX72/nkHy2B/CxTL7tMDzvt\nNEtRW2bXEXBGAu7Zg/gghESDzBwEF66n5OJjxxijFAu0ZmKfPnQoKmJymOW8tYWnfzOsJuclCquZ\n5cGuw3PW4D4GRgSYds5MDsybF9EMThDsIKkypGOBKAdreA7mq5z7rgQeb9qUfY0acdULL4Rl6rA6\ngKYigcxnuysqOGfoUCY/9VRYx9hdXk5ddbXxQwSqs7Lo8ZOfsOPf/+blb75hzJln0nLfPn62cGHK\n95uQWohyEADvWcMkYLZz/0RgDjCxb1/mhFEqI9mf/sGQx45Fjlx+GrtqTXkq1qUZGTRzOFgl9ZyE\nOCM+hxTGzpIMLtv6bd2762Wu8hgemcd/a9bMbStvKLZwO7KWQ4XcRnM8T1/E8hYtJMxViCtE6XNI\nVMluAXtXcLtvzhwe/eADuhcX889+/Xj4sst4tEkTBjrfH/zDD6ycNYuVpaUNYtU4ra0vcqS1eYny\nVUuWcOW2bV7Z4tGwaskSBm3dyiwMP43CWDNXHT+e8osxCemFKIcEEc7gFg4uJdGid296/fgj7zr3\nK6B42zYWTpnSIFaNC2dQN1PCru/AynoYVtm2fj2v5OXxVUYGNwHTgI3ANgtyCkIyIcohQdj5xOr7\nZKy1UbNoLvB
sdjaP9O/PtAEDWJ6XR9vKStuekhNFOIN6MCXs+R0AtvTLvbNn0yonhz86HDiys9H9\n+6MGDOD7AQPY2KsXW9au9fuuYrnwkiBETDQ2qXhsNECfQ6TlIcz8Bb6290D5CXaXpEgk4eRRBMvV\nsFJrKhLZXBVgl5tkhvt+V1J2Q4gFSPmM1CPSJLFAA4mvQ9WzLISnElj+6qtJk5gWLVYH9XgrxFDn\nC/ZdpaqiFpKXaJWDhLImgEjCRLUOvPym7wpnnmUhXKzMzGRpURGnOuPxrZ4z1Yl3rsbK0lIYPZor\nPc63IjOTDOf5gn1XDSWHREgeJM8hTQi0zGXx0KF+WcxXtW/PhWefTUYaKQEz4p2rMWviRHa98w7q\ns89orTWHlUJ360bn4mLunT3b67tyANefcgqvf/+9+7ub2LcvHQYMYPL06ZIPIUSN5DmkAWbmioZk\nKmoIBDMr+ZoSV4BftddIK+AKQiAQs1LDx8w8ko6momQmmBlr67p1XrOYrV9+SXZ1NY2ys+nctSta\nR1YBVxDMiHbmIMuEpgC+y2+CoQR+cuaZogSSCLPvKWvdupDf08rSUi4ZPZpBwEqfpVsFIRHIzCHF\n0NqeekJC4nF9l/c+/ji/vuSSiCrgCoIZ0c4cJAkuxbCz5IaQWFzf5YzJk21PxhOEaJGZQwqhTcJZ\nhdTD87uUCDMhFojPIY1YWVrK8Y8/BuqfLMUunZp4lu64p7oaNW6cbd+lmB4FOxCzUhKitX+9Ha01\nC6dMofnJk7yDPUXihMSgtf0F/zwR06NgB6IckpBAP+6VpaVklpczB2M5ShC7dKoSi4J/LlyKpyFU\n3hUSiyiHJMPsx71s4UKuUwoFFGVkcHv37mzs1Yut69YlVmAhLLTWzJsxg/UXXsi0AQPcW6DvMtAM\nMhR2r08hpDHRZNDFYyPNMqQbekXVdCecKqzhVmw1u0/q6uoazOp/gnWQleAaDtrEFr2ytFRCHRsA\nru/X6up14ZqHzMxVMyZPFh+EEDYSypokaK2567rrGPLuu15VPVdmZvK3oiLaSpmMlCdQ8USzCCXP\nCq+elV2DEajQoENrPt6xg7f275fw5zRDqrI2EFaWlvI/I0bwkzPPpE27du79ogQaBtojr8EzC/rJ\nDRv4/ZQpXmGnZm0jGdjDUUhCw0KqsjYAfBeBcTgcAVd9M1sJTkh+zBZ4evzXv/bzK3iuJufalmdm\n6hWvvRbW9y++qvSGKH0OkgSXBASKMNFaG3bi3r3dT3ruEFePfemM1qmT7BWoKJ/D4WDHiy/ycnU1\nE2fNYvM//sHk6dPZtn49u/LyWO65LkReHo7nnqPp2rWWv/9gIbNy/wihELNSgtEBTAgT+/QBpZjj\nUSYDkNIZPqwsLWXVmDEMWrCArf/8Z0ooCU88TT5vNWvGC0oxZtGigIs4TezbF7Rmzj/+Yfn7j/di\nR0JykVRmJSAXeAOoASqA4SbtHgFqgSqg2vk3z6StrVOtaLHbtBPI3PB406b6rWbN3KaHlaWlAUNc\n0xlPk8mwrl31+DBCPpOBQCaf8aDH9+njtYiTA/QM0MuaNtVPNG0q379gGZIslPUZ4ATQDhgF/FEp\n1c2k7cta6xytdbbzb6XNssSEQNnLWoefrORi2/r1bOjVy50M9Uj//nzcrBk//+EHwAhnXTFzJitj\nWG4hFXGZTAAjczzFMoIDmXyuBNpv3syyv/7VfU+M6d6dikaNeCYjg4zaWqD++3c4HBHfd4IQkmg0\ni+cGZAI/AAUe+54HHg/Q9hHgeYvHtUuRRk0gx7HW1pOVXLMOz6Qk35lIqJmELAfq/T2scG4up22q\n9MnMCRP0w/3764f799cjs7P1w6Afds4SPIMSXNd5m1LuWUYwZ7YguCCJHNJnASe11uUe+7YA/U3a\nX62UOgTsA/5Xa/0nG2WJCYEcx8VDh7qTlSbNmkXx0KGmtmB3/f6TJzkwbx7v9O7t53gO5Ljc+uWX\n/BtY1bIlhz//nNbnnENu27ZkrVuXlo5Fz1nDKmC2c/+Vx46F/A6SBZfN33MFOBcrPYISXPfbEKW4\nvVs3OjvDnD2d2alyzUKKEY1m8dyAfsBen32/AFYHaHsOcBrGbPpiYC9wk8lx7VaoEWEWFuhpHw72\nNO/5+SGnnKLrnPbl8QFmIqE+n+7hiK6n7tu6d9fLMjJSekblupZHBgxwbw/3769njB8fNAw1Eh+U\nhEKnF0Q5c7BTOfwUqPHZNwlYauGz9wOvmbxnd59FRCBzz/IWLfRtZ55pKY7c8/N/A73SxPFs5fyp\nNgDGCrOBdeaECYkWLWrM8iKiqbUVbq0mIbWJVjnYaVb6AmislCrQ9aal84HtFj6rAdM58bRp09z/\nFxYWUlhYGLmUERLI3LPzwAGG/vvfIePItXbWyXE6lAcDEwFdW8sDzjaDgphEfD8frG060ZDDMQPd\nb1prstat8zI3gbX8Bfc9JGaoBktZWRllZWW2Hc/WPAel1GKMgf4OoCfwFnCJ1vozn3bXAH/XWh9V\nSvUBXgce0FovCnBMbaeMdmI1jtwznt3FW8C/wK0cANPyBoE+n66lECoqK5g6eyp7qvbQKacTJZNK\nyM/LT7RYcUNrzfV9+tCjRQsyMjK89gfLX5AyGulHUtVWUkrlAguAgcAh4H6t9StKqX7Acq11jrPd\nYqAYaAp8jeGQ/l+TYyatcrCKS4nsLi+nzllAr+b4cY4CnZo3p1F2Np27djX9gTfEZKZIBvmKygoG\njh1I+fnlxp1TCwVbCnj36XfTRkG4Ev+ufO45y4O71vbVahJSh6RSDrGgISgHX7ROnbIPsSDSQX7U\nuFG8mP2i8RkXtTCyeiSL5vpNOhscnoN8OIO7zDy9SZfZZ7TKQWorJYB0r5E0dfbUesUA0BTKzy9n\n6uypQQf5PVV7oI3Pzqawt2pvzGRNJgKFUlu5f4L5L9Lt/vN6MGkD1MKmsZvSavZpFVEOcUYcg5EP\n8p1yOhlFV3xmDh1zOtotYtIRTVBCqpoeY0GkDybpiKwEF2dkjV+PQd4TC4N8yaQSCrYU1H/WaY4q\nmVQSEzmTiUDlNorT9P6Jhj1Ve+oVw1GgDFgP7218j4rKisQJloSIcogjrqe/dK+RFOkgn5+Xz7tP\nv8vI6pEUVRQxsnpk2pgDPGtwjevenVEZGSzPy2PrunWJFi2lyGmUY9x3R4EPgUuAIthfvJ+BYweK\ngvBAHNJxRByD9bicgnur9tIxp2ODdQraTaROacFg4A0DeW/be8Zj8VD8TJR5a/LIOyuvQTiqJVop\nhWiIIalCfAmWr5DuUXChqKisoPuN3TlRfALWA0UBGr0PXAh8DM1rmlPcs5inpj6VkkpClIMNyI9K\nSAVC5SuY5UDI/W3gFQpdhmFS8pk5UIbRsUWkfC5NtMpBfA4EXqNBEJKNYMt+ekbB+fqx5P428HJG\n/xRYg5ffizWAg3rFAF7RTOlG2oeySmhpcpMuCUuh0Frz3IwZnG2h3pJn
DoTc3/V4hUK3AvoCa6FN\nbRua/9icPU33wDFgA4byaOX8YBrl0niS9soh0sQiIfZIwlI9q5YsoeOOHVwaoGyG29wUIAdC7u96\nSiaVsGnspvo8h0woyClgwQMLuKXkFuiF25TEGgzl0Yq0yaXxJa3NShJamtwES1hKJ4KZjMBwUl/+\nySd+5qZVS5bI/e2BWSj0vFfnsavXLq/7jCJgM2mVS+NLWs8cgtlw0/XpKplI93IZLkI9/b+1cCFH\n6+p4t3t3WjtXitNac+C55xgi97cX+Xn5fpnQZvdZq2OtGFw9mJKn09OUmdbKQWrOJDfpXC7DRaiy\nGVprmn77LYscDiZlZ/PImjVun8KsiRPZUFOTdvd3uH4qs/tscN/BaV1SQ0JZhaRFSnSHTpyUdRq8\nieSeaaj3meQ5JBkSU24v6Z5JHSxx8t7Zs2WdBh8iLeveEO8zUQ5JRiSLsQhCJEg5Fn+KbiuiLL/M\nf39FEasXro6/QAlEkuCSiFBRJYJgJ57F+B7p358xZ5zBhl69wirGp7Vm5gMPNJh7NdKKv4I/MnOw\nEbH/Coki0hlrQ5vpNlT/QSTIzCFJiCZnoqE9vQnxJdIZa0Oc6aZzWXfb0Von9WaImPyseO01vTIz\nU2twb8sVax7hAAAcJElEQVQzM/XK0lJLn52QnW2prSD44nnvrbB4z0XzOSE1cI6dEY+9MnOwia3r\n1vFKmzY80r9/WIux6Ab49CbED9f9E+6MNdLPCemDKAebOK9fP1oePcrF99xDo+xsXnA4aJydzb2z\nZwf9nCwb6k9FZQWjxo2i6LYiRo0b5bc6V6j304lgWf5WP6eBWUDx1q1y/wluxCFtA9qjzv6YM8/k\npq+/ZtDx49zVuDFDXnqJK2+4IeTnJE7dIJRD8e/r/s7guwdT06oGGgHnQkFlejocIfIFpDw/t/Pg\nQVrs2MHxs86i+6BBsvBUAyFah3TCfQqhNlLA5+Bpu30zI0OvAL0C9HjQw7p21Q6HI+TnXFu6235H\n3jNSMwXNNI9tCnrkPSP1VxVf6ay+WfXvT0FzMZpfGe+nMw6HQ8+4/36/e81sv+f7E/r21Q4w/pq0\nE1IPxOeQWLSP7fYah4OVwEpgDpBZXs6qJUsCftYzTt21bQwzTr0h4Gkmenfju0ZNfU+cxfamzp5K\nzRU1/tUzt6dfMT5fzBb0CbXQj5g1BTPSuvCeHQSy+RYqxQKnKWyIUixduDCgaUmm74ZiGDBmALtP\n7jY8YC2Ad4GWQBOMDj3XSGIyq55JXXonObkeUHwX9DHb7/c5k6J+QnojM4co8X36f6R/f55v3Jg6\n4B2MmUTzQ4dwOBySyxCAidMmsvv4brgMYxZwHsYjS6Hz9SXQeGNj7hx2p2n2a9bRrLSst+/C7Ol/\n1ZIlDNq61dTZHKkzW0gPxCFtM2+/8gp/vvlmlgKTgNnAqsxMPr3rLg7Mm9dgMlHton3v9hwoPlBv\nKioj4MLvI6tHcuewOxl87+B601ItZL2fxdu/f5v+/frHW/SkQJsENTy5YQO/vuQSij/8kHeAQcAq\nn2CHSJ3ZQmoghfeSjCvy8xlXWcm1wJvA4tNP55yf/ISPd+zgrf370z4ayZf2lzqVg4s1GDMGHy76\n7CIO1hykPK8ctgN1xozh7WfSVzGAefG9T++6i57PPMOq48eZjfGgUtyiBRkvvCAPJ2mClM9IIurq\n6mi2axfXOF9fC/x45AgX/epXjKuu9pu2ay1lMy465yJvU5EioOnom13fGOGt7TBMTldAzdU1zHt1\nXlrnPbjMmq7Ce4/078/GXr3Y/P77vJKXx+UZGSigKCODV7p04c/Tp6f1/SZYR5SDjdw1ZAi/cji8\nbLi3f/89T40bFzATNVQkSTrw1NSn6PzPzvUK4VzgbepfO01H7Tu29zY1gbGm9N5yBo4dyIvZL1KW\nX8aL2S8ycOzAtFEQ982Zw6MffMDF99xDy6NHuWTcOB794ANe/uQTWuXkcJXDAcDVDgfHT56k844d\naX2/CdYR5WATWmu+WL2aFcAoYBxwI7BQKRodOGC6+Hu6l83Iz8un7M9l5K3Jg1UYkUrnARuA96Fx\naWPe/v3bdO3YNfCMYu839QlzYCiM88uZOntqHK8isXhGJXk+eHg6m8EIq56T5vebYB3xOdjEytJS\nTo4cyfu1tfU2XmB+RgYdzjmHNs6F38H4MR/KyuKasjIp7+3EnRlt4lMwy5xul9WOTd02+R0vnRZ3\nCVQqfuu6dV7O5p0HD3Ld558beThyv6UFkiGdJMycMEFflZur33RmOr8BenBmpr6uUyc9c8IEr7ae\nWakaJDvVyVcVX+mR94zURaOL3BnRwd7/YO0HOu/iPNOM6nTAyr0k95s/rnupcHRhwHutIUCUGdIJ\nH/xDCpgiyiGcH6CUzYieryq+0gWDCzS/cpbQ8CipUTC4QH9V8VVaDABW7iW537xx3zsB7pmGRLTK\nwVazklIqF1gADAQOAlO01i+ZtJ0B3I4Rmr1Aa32/STttp4yxIpz1fCW+PHq8FpI/CmwGfoBTvj2F\nHv/Rg9OyTuPTfZ+y68xdDTr01cq9JPebN173jgtnLs2iuYsSJpfdJFWeg1LKpQjGABdgxJ1crLX+\nzKfdL4EJwOXOXe8Bf9BazwtwzJRQDvIDjC9+C8kfBT7EyJFoCryP4dj+xGOfM/Jp68tb07KCq2Dg\nd++49lv0U1VUVjB19lT2VO2hU04nSiaVJOX9FK1ysK22klIqExgKdNdaHwfWK6WWAbcAU3ya3wo8\nqbXe5/zsk8AvAD/lkCqIAogv7lIarqe/zdQrATDi8Lb77GsKNVfUMKFkAkvnL42rvELy4HfvANRa\nq8/lFRjRxvjcprGbGmTJeDtDWc8CTmqtyz32bcGIXPflXOd7odoJQkBKJpVQsKWgPrz1e4zw1zUY\nJTh+AOoImBvxzqfvpE0ehNaSaOmL373jjHy7c9idIZMpp86emjah03YqhyzgO5993wHZFtp+59wn\nCJbwXEj+om0X0bi2sVGTyVmsDw3sJWBuxImsEw3yxxwISbT0x/PeKaooYmT1SBY8sIAx08eETKbc\nU7Un4ANHQywZb6dyqAFyfPblANUW2uY49wmCZfLz8lk0dxEFHQs4Ofik9zoPV0Az1YyMtzK8nhBZ\nA1zYMH/Mvmgt65Ob4bp3Vi9czaK5i5j36jxLMwKzysANsWS8nes5fAE0VkoVeJiWzsew/Pqy3fne\nP52vf2rSDoBp06a5/y8sLKSwsNAGcYWGgtk6Dz0v7En75u1Zunap8RikgL5AJnSsa3g/Zl8ClfKW\nxLfAmN1Dvg8RJZNK2DR2k18yZsnTiS8ZX1ZWRllZmW3HsztaaTHGhP4OoCfwFnCJSbTSOIyQVzCW\nPviD1vovAY6ZEtFKQmywEhkSLDSxZFJJ0DWprZ4j1dBa1icPh3DCW133y96qvXTM6Zi090uyhbJ6\n5jkcAu7XWr+ilOoHLNda53i
0nY6hRDTwF631gybHFOWQppiVzPCNDAnVLtiP2eo5Uo1w8m6Ehnkf\nJJVyiAWJUg5aa2Y9+CD3PfGEPGkliHg8zTXUhCjJuwmfVJkRWCVp8hwaGu4oj9695UkrQVi1A0O9\ngzGW50glRAGET6T3UENFSnYHQKI8koN4RIakU/SJIISDKIcAmC3YDobimHH//cyQxKKYY5asVDLJ\nvsiQeJxDEFIR8Tn4ECrKY2VpKc/dcgunKsVVsh5vzImHHbih2ZoFAcQhbTvBojyKhw5l4kUXwT/+\nwRxgYt++zJHQQEFICRpiyHIwxCFtM9vWr6emVy82+kZ5rFuH1pr2mzdzAUY+1X9u3iyJRQ2EVBk4\nUkXOWBJJH6RTwTy7kJmDRbTWXrMGl8lJZg+pT6rEuKeKnLEk0j5oqCHLwYh25iAOaYusWrKE9ps3\n8zNwL9ruOXsQUpdUqbTpJedRYAOUV5Vz+YjL06bKbKTflVnBvPc+fi9t+i5cRDlYZNv69XzUrh2L\nW7bkNuc2OieH19u2Zeu6dYkWT4iCVKm06ZbTtbDRJcAVUFlUGbCCaDJQUVkRsgx2OO29vqujGOXZ\n18N7G4MP8mYhy/sb7U/avks0YlYS0p5wTQ6Jsvu75dyAoRiS3EQSrgnISnt3HxzDe+W/CI7NGtyF\nGJOt7+xAzEqCECXh5Dq4BplQdf9jKqfJIkbJNtMJxwRUUVnB5SMuD9ne3Qcf47fKXzDzkmsNh1Pf\nOdVQChswFEMrkrLvkgFRDkLaE2jxF7Mn0ET6J1xy5p3MS4msbqvmOpfCrWxcGbK9e5CvOzVsBZmf\nl8/AiwfCpUAhhmKApOy7ZECUgyBQX1dn/rT5AIyZNia0zdtFHJ888/PyWb14dUpkdVstTeJWuI2w\n1D4/L5+BFw6MSEFKRrx1RDkIghMrJqNkqMUUzkwnkVgdiN0K96cYJh8LA3ekg3yq9F0yIA5pQXBi\nxTEtuQbhYaU0iVe/HwU2A3WQdzKP1YtXm/br39f9ndGTR3O07iitGrXirzP/Sv9+/WN+TamClM8Q\nBJsouq2Isvwy//0VRaxeuNr9Wmox2UskCtfqAk/pnEkuykEQbCIds2iThXAVbrRLw6YDohwEwSYi\nictP96dTu7Hap8FmeR1zOoqSR5SDINiK1SdY8T3Yh6vPv9z7Jdu/3k7NFTVGnx6EzLJMWpzSgkaO\nRlx87sXMmTaH/Lz8oDOHPVV7/BXHUWi/rj3dzu2WNopclIMg2EQ4MwExQdmDl5L1zPx2lQjxyIBm\nNZzR4gw+WPABgKlynjp7qvd3cxTYBFxOWilyyZAWBIsEq9sTbuZzrPIdwq1FlOp4JRVq6vt0M4Zi\nOIa7fhJNYHfNbqbOnho0JNUvzPVj6hUDJG1hxWRD1nMQ0oJQ9fyDZT4Hmgm48x18Zg7R5Dsk25oD\n8fCp7KnaY1wrGGWOXX2qCVg/ieVQvrccqE9c9MWlOFzmwe112znQ9IB3IymZERKZOQhpQaiyF+HO\nBAIlYZ2x8Qyqv6+O+Kk/nqU5Qs1Q4lVDyiup0JUEdxA4AKzGeHx1LcrYFPg5fLP3m5DHdSmO1QtX\nR5xNne6IchDSglCDf06jnIADSHaj7IDH8zVrXLvnWlQTxbLTlkU8mMarNIeVgT9eispLybYCuoJa\nq+A6YChwGcbs4Wi9HKd1Ps3y8SsqK6j+vprmK5rD+87jSMkMS4hyENKCUGUvVJ0ynlQ9ZgKsNvab\nPWV7Pp1m5WSxq9eusAdTz2NXflFpPDWbyOjbPlKfhJWBP16KylfJ5n2Zh75Ke8lGEYYPAoyB/dQC\nr2OY9YlLCS47bRknrj0Bl0GLdS24ds+1Dd4ZbQficxDSgpJJJWwau8kvuqXkaePp8Tu+g4swImY0\nhv37Iti/e78lP4CX7dxFiMHUz8fQCRqvbMzJi09CO38Z7fJJWJE1HJ9KtL4JT99B0W1FVDat9JMN\njV9/uM5t1ieBlODx4uNkVWeJYrCAzByEtCBUwbVOOZ0gE6OUc5Hzb6Zh37ZiXomkIF+gwevklSfJ\n+yQvoIx2mXqsyGq1sJ3dvgkz2dp/3z5gkbxgfZLoCrqpjigHIW3wNAMtmrvIa5AxGwxP63yapQEm\nkiqhZoNX/ln5AWW0a7CzIqvV6qVmg/OEkglhyRRKto2vbfTrDwjeJ8lQQTeVEeUgCJgPhgWnFlga\nYCIpBW02eGU3yg5oQ7drsLMqazBl6sJscH7n03cimj2E24/B+kTWbogOyZAWhCBEWjHUig0+0LE7\n/7Mz+kfN7u67YTtQB1lHs3j7mbc54/Qzkq5kh1mmOGth5E9jny1utTprOlbQlfIZghBjwhlgIi3e\n5zp2TVUNS7OWwid4JX9lvZ/F1pe3AiTVYFdRWUH3G7tzovhEfaLaGqAvFB3xLnUeSxmSqU+SBVEO\ngpBEmD1J560JvnCNi6LbiijbWVZfY8jjGMlat2nIbUNY+tVSw0itMJLZMoPLKxVtY0+0ykFCWQXB\nRszCRCsbVzJw7EBrfog64h5l4xqsyw+U882ub2jfsT1dO3a1NGjPmTaHf439l2mYcKBzmYWfAqI0\nkgSZOQiCjZja4J0VR0M9/VdUVtDj5z04dt2xuM0cApnCWANcAAWV1nwawUw7vrOEmqoalnZa6nd9\n13xzDdv3b08qn0oqI2YlQUgi/r7u7wy+d3D9mgQeNnha+S856ktFZQWX3nop+37c51ViuvM/O1P2\n57KYDJLRKrRgBFI8zd9pzol+J4xyGR60f6c9+wv3p4w5LdlJmpLdSqlcpdQbSqkapVSFUmp4kLaP\nKKVqlVJVSqlq5988u2QRhERQUVnBmOljqOlVA29g1PLZgFsxWAk7nTp7Kvsu2wfdMY7xurGd2HOC\nMdPGxKSMt1k4qquEdjTmrEB5ECeKTxhltD2pBV2r425OE8yx0+fwDHACI/H/AuBtpdRmrfVnJu1f\n1lrfauP5BSGheA2EgzAKxl1GQDu8mUN2T9UeaAR8hlF8zvnZA8sPcCD3AGTaX8bbrFSGq4R2OHkU\nvtdVfqA8oA+meU1zTtSe8OqbHuf2YGmtv7lJktYSgy3KQSmViVFDsbvW+jiwXim1DLgFmGLHOQQh\n2fFyRrfCmDFsgFbHWjG472BKni5xx96bOWQ75XQynqpdYazgLlXNBqAw+DoTkVAyqYSlNy/1NoW9\nDzggY1kG1X2qqaissOR38L2urLIsKMBvwC/uWUx2dXa9j8KpNMNxbAuxxa6Zw1nASa11uce+LUD/\nIJ+5Wil1CNgH/K/W+k82ySIICcHvCbwVcAkMrh7sNZAHqwdUMqmEJUOXcKLpCe+DNwUOY/gvFJS3\nqf+p2VH47tzTz+XDDR8a8h/CGBmGgKOpg2W1y9g+dnvI2Uqg66rpX0PW+1leiqdgSwFPPf1UwGN5\nLtLjUhrijE4MdimHLOA7n33fAYGL4cMrwJ+B/Ri1MJcopY5orV+xSR5BiDuhKr+6CFYV
NT8vn+Ke\nxSyrXeZv5mmJMaOohX+9/y+378GOSq1dO3blw7M/NM5ZhneeRVNrs5WA19UOepzeg4LqAksDvtnq\nbkL8saQclFJrgAEYLipf1gPjMG5dT3KA6kDH01p/7vFyo1LqD8ANGErDj2nTprn/LywspLCw0IrY\nghBXfJenNBsIQ5XDfmrqU2wf6x3SyWqMxyiMz9VcUeOuxhrO8qZmeCk2z7WcXVhwDJtdV0HHAhnw\n40BZWRllZWW2Hc+WUFanz+EwcK7LtKSU+iuwR2sd0ueglJoM9NFa3xDgPQllFRoUVkpseOYNbN26\nlW8HfOsX+llUUYRGU5Zf5neOUCGznrK4TFItaYlupPlw64fsLw4/pDSSOlRC7EiKUFat9TGMoLvH\nlFKZSqlLgWuAFwK1V0pdo5Rq5fy/D8bM4007ZBGERBNqtTYrlUc9K6Je2e9KY60JT5wzjWgqtfqu\nxbC001K279/Oq3NejaiaaSSVaYXkxbYkOKVULrAAGIjh0rrf5UNQSvUDlmutc5yvFwPFGM8XX2M4\npP/X5LgycxBShlg8PQc7JhDx+cyS30ZWj6RkUknci9lJvSV7kQxpQUgigg240djdrZSnCHcgL7qt\nyDaTVLSDuZik7EcK7wlCAvEdIL/c+yX8h08jG7J8g0XxRBrhE8460b7YtZ61i2DhveLMTgyiHAQh\nQgImfX2dBadh1AlwkSRZvr6K7M5hd7JpeujQ20DYPZgHC+8VEoMoB0GIkIBJX1fUkPW3LGqurgl7\nwI0lAZ/0p29iwQMLmPfqvLCTzuwezKOZxQixQZSDIESI2QDZo7v1pC87CeYDMHvSHz15tKVFiHyx\nezC3mkAoxA9RDoIQIaZJX6fGP+krlA8g2kWIfLF7MLeaQCjED4lWEoQISaYIm1BRUrFYs0HWbk5u\nJFpJEBJEfl4+Cx5YwOjJozlad5RWjVqxYOaChAyQoXwAgZ70WQ40AzZ4F/KzSqRRUpLPkBqIchCE\nCHEt7lNZVAlN4WjtUcZMH5OQmUMoH4DLbHP5iMuppBK+xUhXbYdXIb94JLrZGQIrxA4xKwlChMQq\n4S0SrJq4KiorOO+a8+qjqeIsdzL1WUMnKWorCUI6Yra8ZiJi863WNcrPy6dH9x4JkzuZ+kwIjpiV\nBCFCki0236oPoODUAjbVbkqI3MnWZ4I5YlYShAhJpmilcEik3KnaZ6mIFN4ThASSquGciZQ7Vfss\n1RDlIAiCIPghDmlBEGJKqMWLhIaJzBwEQTBFfASpi8wcBEGIGcFKcwsNG1EOgiCYInkJ6YsoB0EQ\nTHHnJXgieQlpgfgcBEEwRXwOqYuEsgqCEFMkLyE1EeUgCIIg+CHRSoIgCILtiHIQBEEQ/BDlIAiC\nIPghykEQBEHwQ5SDIAiC4IcoB0EQBMEPUQ6CIAiCH6IcBEEQBD9EOQiCIAh+iHIQBEEQ/BDlIAiC\nIPghykEQBEHwwxbloJT6lVLqI6XUCaXUAgvtJyql9imljiilnlVKNbFDDkEQBMEe7Jo57AFKgPmh\nGiqlBgGTgSIgDygAHrVJDkEQBMEGbFEOWus3tdbLgMMWmt8KzNdaf661/g5DqfyXHXIkkrKyskSL\nYAmR0z5SQUYQOe0mVeSMlkT4HM4Ftni83gKcqpTKTYAstpEqN4zIaR+pICOInHaTKnJGSyKUQxbw\nncfr7wAFZCdAFkEQBCEAIZWDUmqNUsqhlKoLsP09gnPWADker3MADVRHcCxBEAQhBti6TKhSqgTo\npLUeE6TNi8BXWuupzteXA4u01h1N2ssaoYIgCBEQzTKhje0QQCnVCGgCNAIaK6WaASe11nUBmj8P\nPKeUWgx8AzwEPGd27GguThAEQYgMu3wOvwGOAfcDI53/PwSglDpDKVWllDodQGu9CpgJrAEqnNs0\nm+QQBEEQbMBWs5IgCILQMJDyGYIgCIIfSaccwinFoZQarZQ66TRbVTv/9k82OZ3tE1IyRCmVq5R6\nQylVo5SqUEoND9L2EaVUrU9/5iWBXDOUUoeUUgeVUjNiIU+0csaz7wKcO5zfTMJK11iVM8G/66bO\nfqlUSn2nlPpYKXVlkPaJ+l1bljPS/kw65UAYpTicbNBa52its51/IwmvjYRUKRnyDHACaAeMAv6o\nlOoWpP3LPv1ZmUi5lFK/BK4B/gM4D7hKKXVnjGSKWE4n8eo7Xyzdi0lQuiac33aifteNgV3AZVrr\nlsDDwKtKqc6+DRPcn5bldBJ2fyadcgizFEfCSIWSIUqpTGAo8But9XGt9XpgGXBLrM9to1y3Ak9q\nrfdprfcBTwK3JaGcCSOMezGhpWtS4bettT6mtX5Ma73b+fptjKCZCwM0T1h/hilnRCSdcoiAnkqp\nA0qpz5VSv1FKJeM1JapkyFkYIcXlPuc+N8hnrnaacLYppf47CeQK1HfB5LeTcPsvHn0XDalUuiYp\nftdKqfbAmcD2AG8nTX+GkBMi6E9b8hwSyAdAD631TqXUucCrwI9AXO3SFghWMuRIHM/rOrdZqZJX\ngD8D+4GLgCVKqSNa61cSKFegvsuyWR4zwpEzXn0XDYm6D8MlKX7XSqnGwCJgodb6iwBNkqI/LcgZ\nUX/GVRsrm0txaK0rtdY7nf9vBx4Dbkg2OYlRyRALctYALX0+lmN2Xuf0+BttsBH4Azb0ZwB8+yOY\nXIH6riYGMgXCspxx7LtoSInSNbH6XYeDUkphDLg/APeYNEt4f1qRM9L+jKty0FoXaa0ztNaNAmx2\nRSNEnVEdAzm3A+d7vP4psF9rHdXThQU5vwAaKaUKPD52PuZTT79TYEN/BuALjEx6K3IF6jur8kdL\nOHL6Equ+i4aY3IdxIt59OR9oCww1qfQAydGfVuQMRMj+TDr7vFKqkVKqOR6lOJRRniNQ2yuVUqc6\n/z8HI1P7zWSTE6NkyO1KqW5Oe2TQkiF2obU+BrwOPKaUylRKXYoR+fNCoPZKqWuUUq2c//cBxhGD\n/gxTrueBSUqpjkqpjsAk4tB34coZr74LRBj3YkLuw3DlTOTv2nnOPwHnANdorWuDNE10f1qSM+L+\n1Fon1QY8AjiAOo/tYed7ZwBVwOnO17Mw6jNVA186P9so2eR07pvglPUo8CzQJE5y5gJvYEyBK4Gb\nPN7rB1R5vF4MHHLK/n/Ar+Itl69Mzn3TgW+dsj0R5/vRkpzx7Dur96LzPqxOhvswHDkT/Lvu7JTx\nmPP81c7vdHiS/a5DyRl1f0r5DEEQBMGPpDMrCYIgCIlHlIMgCILghygHQRAEwQ9RDoIgCIIfohwE\nQRAEP0Q5CIIgCH6IchAEQRD8EOUgCIIg+CHKQRAEQfDj/wGdzYfwBTw/sQAAAABJRU5ErkJggg==\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "y_pred_idx = 
y_pred.reshape(-1) # a 1D array rather than a column vector\n", "plt.plot(X_test[y_pred_idx, 1], X_test[y_pred_idx, 2], 'go', label=\"Positive\")\n", "plt.plot(X_test[~y_pred_idx, 1], X_test[~y_pred_idx, 2], 'r^', label=\"Negative\")\n", "plt.legend()\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now that's much, much better! Apparently the new features really helped a lot." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's open tensorboard, find the latest run and look at the learning curve:" ] }, { "cell_type": "code", "execution_count": 143, "metadata": {}, "outputs": [], "source": [ "%tensorboard --logdir {root_logdir}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now you can play around with the hyperparameters (e.g. the `batch_size` or the `learning_rate`) and run training again and again, comparing the learning curves. You can even automate this process by implementing grid search or randomized search. Below is a simple implementation of a randomized search on both the batch size and the learning rate. For the sake of simplicity, the checkpoint mechanism was removed." ] }, { "cell_type": "code", "execution_count": 144, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Iteration 0\n", " logdir: tf_logs/logreg-run-20170606195328/\n", " batch size: 19\n", " learning_rate: 0.00443037524522\n", " training: .....................\n", " precision: 0.979797979798\n", " recall: 0.979797979798\n", "Iteration 1\n", " logdir: tf_logs/logreg-run-20170606195605/\n", " batch size: 80\n", " learning_rate: 0.00178264971514\n", " training: .....................\n", " precision: 0.969696969697\n", " recall: 0.969696969697\n", "Iteration 2\n", " logdir: tf_logs/logreg-run-20170606195646/\n", " batch size: 73\n", " learning_rate: 0.00203228544324\n", " training: .....................\n", " precision: 0.969696969697\n", " recall: 0.969696969697\n", "Iteration 3\n", " logdir: tf_logs/logreg-run-20170606195730/\n", " batch size: 6\n", " learning_rate: 0.00449152382514\n", " training: .....................\n", " precision: 0.980198019802\n", " recall: 1.0\n", "Iteration 4\n", " logdir: tf_logs/logreg-run-20170606200523/\n", " batch size: 24\n", " learning_rate: 0.0796323472178\n", " training: .....................\n", " precision: 0.980198019802\n", " recall: 1.0\n", "Iteration 5\n", " logdir: tf_logs/logreg-run-20170606200726/\n", " batch size: 75\n", " learning_rate: 0.000463425058329\n", " training: .....................\n", " precision: 0.912621359223\n", " recall: 0.949494949495\n", "Iteration 6\n", " logdir: tf_logs/logreg-run-20170606200810/\n", " batch size: 86\n", " learning_rate: 0.0477068184194\n", " training: .....................\n", " precision: 0.98\n", " recall: 0.989898989899\n", "Iteration 7\n", " logdir: tf_logs/logreg-run-20170606200851/\n", " batch size: 87\n", " learning_rate: 0.000169404470952\n", " training: .....................\n", " precision: 0.888888888889\n", " recall: 0.808080808081\n", "Iteration 8\n", " logdir: tf_logs/logreg-run-20170606200932/\n", " batch size: 61\n", " learning_rate: 0.0417146119941\n", " training: .....................\n", " precision: 0.980198019802\n", " recall: 1.0\n", "Iteration 9\n", " logdir: tf_logs/logreg-run-20170606201026/\n", " batch size: 92\n", " learning_rate: 0.000107429229684\n", " training: .....................\n", " precision: 0.882352941176\n", " recall: 0.757575757576\n" ] } ], "source": [ "from scipy.stats import reciprocal\n", "\n", 
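"# Note: reciprocal(a, b) is the log-uniform distribution over [a, b], a common\n", "# choice when the right order of magnitude of a hyperparameter is unknown.\n",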
"n_search_iterations = 10\n", "\n", "for search_iteration in range(n_search_iterations):\n", " batch_size = np.random.randint(1, 100)\n", " learning_rate = reciprocal(0.0001, 0.1).rvs(random_state=search_iteration)\n", "\n", " n_inputs = 2 + 4\n", " logdir = log_dir(\"logreg\")\n", " \n", " print(\"Iteration\", search_iteration)\n", " print(\" logdir:\", logdir)\n", " print(\" batch size:\", batch_size)\n", " print(\" learning_rate:\", learning_rate)\n", " print(\" training: \", end=\"\")\n", "\n", " reset_graph()\n", "\n", " X = tf.placeholder(tf.float32, shape=(None, n_inputs + 1), name=\"X\")\n", " y = tf.placeholder(tf.float32, shape=(None, 1), name=\"y\")\n", "\n", " y_proba, loss, training_op, loss_summary, init, saver = logistic_regression(\n", " X, y, learning_rate=learning_rate)\n", "\n", " file_writer = tf.summary.FileWriter(logdir, tf.get_default_graph())\n", "\n", " n_epochs = 10001\n", " n_batches = int(np.ceil(m / batch_size))\n", "\n", " final_model_path = \"./my_logreg_model_%d\" % search_iteration\n", "\n", " with tf.Session() as sess:\n", " sess.run(init)\n", "\n", " for epoch in range(n_epochs):\n", " for batch_index in range(n_batches):\n", " X_batch, y_batch = random_batch(X_train_enhanced, y_train, batch_size)\n", " sess.run(training_op, feed_dict={X: X_batch, y: y_batch})\n", " loss_val, summary_str = sess.run([loss, loss_summary], feed_dict={X: X_test_enhanced, y: y_test})\n", " file_writer.add_summary(summary_str, epoch)\n", " if epoch % 500 == 0:\n", " print(\".\", end=\"\")\n", "\n", " saver.save(sess, final_model_path)\n", "\n", " print()\n", " y_proba_val = y_proba.eval(feed_dict={X: X_test_enhanced, y: y_test})\n", " y_pred = (y_proba_val >= 0.5)\n", " \n", " print(\" precision:\", precision_score(y_test, y_pred))\n", " print(\" recall:\", recall_score(y_test, y_pred))\n", "\n", "file_writer.close()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `reciprocal()` function from SciPy's `stats` module returns a random distribution that is commonly used when you have no idea of the optimal scale of a hyperparameter. See the exercise solutions for chapter 2 for more details. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.10" }, "nav_menu": { "height": "603px", "width": "616px" }, "toc": { "navigate_menu": true, "number_sections": true, "sideBar": true, "threshold": 6, "toc_cell": false, "toc_section_display": "block", "toc_window_display": true } }, "nbformat": 4, "nbformat_minor": 1 }