{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Linear Layer" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We will be implementing a **Linear Layer** as they are a fundamental operation in DL. The objective of a linear layer is to map a fixed number of inputs to a desired output (whether it be a regression or classification task)\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Forward Pass\n", "\n", "A neural network architecture consists of 2 main layers: first layer (**input**) and last layer (**output**).\n", "\n", "**Node** or **neuron** is the simplest unit of the neural network. Each neuron held a numerical value that will be passed (forward direction in this case) to the next neuron by a mapping. For the sake of simplicity, we will only discuss the **linear neural network** and linear mapping in this lesson.\n", "\n", "Let's consider a simple connection between 2 layers, each has 1 neuron,\n", "\n", "\n", " \n", "We can map the input neuron $x$ to the output neuron $y$ by a linear equation,\n", "\n", "$$y = wx + \\beta$$\n", "\n", "where $w$ is called the **weight** and $\\beta$ is called the **bias term**.\n", "\n", "If we have $n$ input neurons ($n>1$) then the output neuron is the linear combination,\n", "\n", "\n", "\n", "\n", "$$\n", "\\hat{y}=\\beta + x_1w_1+x_2w_2+ \\cdots +x_{n}w_n\n", "$$\n", "\n", "where $w_i$'s are weights corresponding to each map (or arrow). 
\n", "\n", "Similarly, if there are $m$ output neurons then the ouput is the system of multi-linear equations,\n", "\n", "\n", "\n", "\n", "$$\\hat{y_1}=\\beta_1 + x_1 w_{1,1}+x_2 w_{1,2}+ \\cdots +x_nw_{1,n}$$\n", "\n", "$$\\hat{y_2}=\\beta_2 + x_2 w_{2,1}+x_2 w_{2,2}+ \\cdots +x_nw_{2,n}$$\n", "$$\\vdots$$\n", "$$\\hat{y_m}=\\beta_m + x_n w_{m,1}+x_2 w_{m,2}+ \\cdots +x_nw_{m,n}$$\n", "\n", "Compactedly, it can be written in matrix form\n", "$$\n", "\\hat{Y} \n", "= \\left(\\begin{array}{c} \\hat{y}_{0} \\\\ \\hat{y}_{1} \\\\ \\vdots \\\\ \\hat{y}_{m} \\end{array}\\right)\n", "= \\left(\\begin{array}{ccccc} \n", "\\beta_1 & w_{1,1} & w_{1,2} & \\cdots & w_{1,n} \\\\\n", "\\beta_2 & w_{2,1} & w_{2,2} & \\cdots & w_{2,n} \\\\\n", "\\vdots & \\vdots & \\vdots & \\vdots & \\vdots \\\\\n", "\\beta_m & w_{m,1} & w_{m,2} & \\cdots & w_{m,n}\n", "\\end{array}\\right)\n", "\\cdot \\left(\\begin{array}{c} x_{1} \\\\ x_{1} \\\\ \\vdots \\\\ x_m \\end{array}\\right)\n", "= W \\cdot X\n", "$$\n", "\n", "This logic can be extented further as we increase more layers.\n", "\n", "\n", "\n", "The second layer (and beyond) is called the **hidden layer**. The number of hidden layer is usually decided by the complexity of the problem.\n", "\n", "**Fact:**\n", "\n", "* If the weight $w_i\\neq 0$ for all $i$, then we have a **fully connected neural network.**\n", "\n", "* The number of of neuron for each layers can be different. Moreover, they tend to decrease sequentially. Ex: \n", " $$500 \\text{ neurons} \\rightarrow 100 \\text{ neurons} \\rightarrow 20 \\text{ neurons}$$\n", "\n", "* Most of the practical neural networks are non-linear. This result is achieved by applying a non-linear function on top of the linear combination. This is called the **activation function**." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Backward Pass\n", "\n", "Now that we know how to implement the forward pass, we must next solve how it is that we are going to backpropagate our linear operation. \n", "\n", "Keep in mind that backpropagation is simply the gradient of our latest forward operation (call it $o$) w.r.t. our weight parameters $w$, which, if many intermediate operations have been performed, we attain by the chain-rule\n", "\n", "$$\n", "\\hat{y} = 1w_0+x_1w_1+x_2w_2+x_3w_3\\\\z = \\sigma(\\hat{y}) \\\\\n", "o = L(z,y) \n", "$$\n", "\n", "$$\n", "\\frac{\\partial o}{\\partial w} = \\frac{\\partial o}{\\partial z}*\\frac{\\partial z}{\\partial \\hat{y}}*\\frac{\\partial \\hat{y}}{\\partial w}\n", "$$\n", "\n", "Now, notice that during the backward pass, partial gradients can be classified in two ways:\n", "\n", "1. An **Intermediate operation** ($\\frac{\\partial o}{\\partial z},\\frac{\\partial z}{\\partial \\hat{y}}$) or \n", "2. A **\"Receiver\" operation** ($\\frac{\\partial \\hat{y}}{\\partial w}$)\n", "\n", "Notice that the intermediates have to be calculated to get to our \"Receiver\" operation, which receives a \"step\" operation once its gradient has been calculated.\n", "\n", "In the above example, none of our intermediate operations introduced any new parameters to our model. However, what if they did? Look below\n", "\n", "$$\n", "\\hat{y_1} = 1w_0+x_1w_1+x_2w_2+x_3w_3\\\\z = \\sigma(\\hat{y})\\\\l = z*w_4 \\\\o = L(l,y) \n", "$$\n", "\n", "$$\n", "\\frac{\\partial o}{\\partial w_{0:3}} = \\frac{\\partial o}{\\partial l}*\\frac{\\partial l}{\\partial z}*\\frac{\\partial z}{\\partial \\hat{y}}*\\frac{\\partial \\hat{y}}{\\partial w_{0:3}} \\\\\\frac{\\partial o}{\\partial w_{4}} = \\frac{\\partial o}{\\partial l} * \\frac{\\partial l}{\\partial w_4}\n", "$$\n", "\n", "Given that we now have two operations that introduce parameters to our model, we need to make two backward calculations. 
More importantly, however, notice that their \"paths\" differ in the way that they take the gradient of $l$ w.r.t. either its parameter $w_4$ or its input $z$\n", "\n", "Clearly, these operations are not equivalent\n", "\n", "$$\n", "\\frac{\\partial l}{\\partial z} \\not= \\frac{\\partial l}{\\partial w_4}\n", "$$\n", "\n", "Despite them originating from the same forward linear operation. \n", "\n", "Hence, this demonstrates that for any forward operation with weights, such as our Linear Layer, we need to implement two different backward operations: the intermediate pass (which takes gradient w.r.t. the input) and the \"Receiver\" pass (which takes gradient w.r.t. operation parameter). For either of these operations, we must integrate the incoming gradient ($\\frac{\\partial z}{\\partial \\hat{y}},\\frac{\\partial o}{\\partial l}$) with our Linear Layer gradient ($\\frac{\\partial \\hat{y}}{\\partial w_{0:3}},\\frac{\\partial l}{\\partial w_4}$)\n", "\n", "Having defined the two types of backward operations, we will now define the general method to compute both calculations on our Linear Layer.\n", "\n", "Assume we have below forward operation\n", "\n", "$$\n", "y=1w_0+2w_1+3w_2+4w_3\n", "$$\n", "\n", "Then, for the backward phase, we need to take the partial derivative w.r.t. to each weight coefficient\n", "\n", "$$\n", "\\frac{\\partial y}{\\partial w} = 1\\frac{\\partial y}{\\partial w_0} + 2\\frac{\\partial y}{\\partial w_1} + 3\\frac{\\partial y}{\\partial w_2} + 4\\frac{\\partial y}{\\partial w_3}=1+2+3+4\n", "$$\n", "\n", "\n", "What about the partial w.r.t. its input?\n", "\n", "$$\n", "\\frac{\\partial y}{\\partial x} = w_0\\frac{\\partial y}{\\partial x_0} + w_1\\frac{\\partial y}{\\partial x_1} + w_2\\frac{\\partial y}{\\partial x_2} + w_3\\frac{\\partial y}{\\partial x_3}=w_0+w_1+w_2+w_3\n", "$$\n", "\n", "\n", "Easy, right? 
We find that the \"Receiver\" version of our backward pass is equivalent to the input while its intermediate derivative is equal to its weight parameters. \n", "\n", "As a last step, to really be able to generalize these operations to any kind of differentiable architecture, we will show the general procedure to integrate the incoming gradient with our Linear gradient\n", "\n", "**Gradient Generalization w.r.t weights and input**\n", "\n", "\n", "$$\n", "input: \\text{n x f}\n", "$$\n", "\n", "$$\n", "weights: \\text{f x h}\n", "$$\n", "\n", "$$\n", "y: \\text{n x h}\n", "$$\n", "\n", "$$\n", "incoming\\_grad: \\text{n x h}\n", "$$\n", "\n", "$$\n", "grad\\_y\\_wrt\\_weights: \\text{(incoming_grad'*input)' = (h x n * n x f)' = f x h}\n", "$$\n", "\n", "$$\n", "grad\\_y\\_wrt\\_input: \\text{(incoming_grad*weights') = (n x h * h x f) = n x f}\n", "$$\n", "\n", "\n", "Now that we know how to generalize a linear layer, let's implement the above concepts in PyTorch" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Create Linear Layer with PyTorch" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we will implement our own Linear Layer in PyTorch using the concepts we defined above. \n", "\n", "**However**, before we begin, we will take a different approach in how we will define our bias\n", "\n", "Initially, we defined a bias column as below:\n", "\n", "$$\n", "\\begin{pmatrix}1 & x_{11} & x_{12} & x_{13} \\\\1 & x_{21} & x_{22} & x_{21} \\\\1 & x_{31} & x_{32} & x_{33} \\\\\\end{pmatrix}\n", "$$\n", "\n", "However, this formulation has some practical problems. For every forward input that we receive, we will have to ***manually add a column bias***. This column addition is a non-differentiable operation and hence, it messes with the entire DL methodology of only operating with differentiable functions. 
\n", "\n", "Therefore, we will re-formulate the bias as an addition operation of our linear output\n", "\n", "$$\n", "\$$\\begin{pmatrix}1 & x_{11} & x_{12} & x_{13} \\\\1 & x_{21} & x_{22} & x_{21} \\\\1 & x_{31} & x_{32} & x_{33} \\\\\\end{pmatrix}\\begin{pmatrix}w_0 \\\\w_1 \\\\w_2 \\\\w_3\\end{pmatrix}\$$ = \n", "\\begin{pmatrix}y_0 \\\\y_1 \\\\y_2 \\end{pmatrix} = \n", "\\begin{pmatrix} x_{11} & x_{12} & x_{13} \\\\ x_{21} & x_{22} & x_{21} \\\\ x_{31} & x_{32} & x_{33} \\\\\\end{pmatrix}\n", "\\begin{pmatrix}w_1 \\\\w_2 \\\\w_3\\end{pmatrix} + \n", "\\begin{pmatrix}w_0 \\\\w_0 \\\\w_0\\end{pmatrix}\n", "$$\n", "\n", "In this sense, our Linear Layer will now be a two-step operation if the bias is included. \n", "\n", "As for the backward pass, the differential of a simple addition will always be 1s. Hence, our forward and backward pass for the bias becomes two simple operations. \n", "\n", "Now, to reduce boilerplate code, we will subclass our Linear operation under PyTorch's torch.autograd.Function. 
This enables us to do three things:\n", "\n", "i) define and generalize the forward and backward pass \n", "\n", "ii) use PyTorch's \"context manager\" that allows us to save objects from the forward and backward pass and lets us know which forward inputs need gradients (which let us know if we need to apply an Intermediate or \"Receiver\" operation during backward phase)\n", "\n", "iii) Store backward's gradient output to our defined weight parameters" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#Uncomment this line to install torch library\n", "#!pip install torch" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "tensor([[ 0.6623, 0.8345],\n", " [-0.1770, 0.7527]], device='cuda:0')" ] }, "execution_count": 1, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import torch\n", "import torch.nn as nn\n", "\n", "#No Nvidia graphic card\n", "torch.rand((2,2))\n", "\n", "# Nvidia graphic card\n", "torch.randn((2,2)).cuda()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### What do the codes above do?\n", "The import command will load the torch library into your notebook. \n", "torch.rand((m,n)) will create a matrix size m x n filled with random values in range [0,1)\n", "\n", "> Note: You will see the output has a type called Tensor which is a matrix used for storing arbitrary numbers.\n", "\n", "If your computer/laptop does not have Nvidia graphic card, the torch.rand((m,n)).cuda() will yield an error. \n", "\n", "> Note: Having a graphic card with CUDA interface will enable parallel computing capability when building deep learning model which can drastically decrease training time. 
However, our model can still be trained without it.\n" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "# keep in mind that @staticmethod simply let's us initiate a class without instantiating it\n", "# Remember that our gradient will be of equal dimensions as our weight parameters\n", "\n", "\n", "class Linear_Layer(torch.autograd.Function):\n", " \"\"\"\n", " Define a Linear Layer operation\n", " \"\"\"\n", " @staticmethod\n", " def forward(ctx, input,weights, bias = None):\n", " \"\"\"\n", " In the forward pass, we feed this class all necessary objects to \n", " compute a linear layer (input, weights, and bias)\n", " \"\"\"\n", " # input.dim = (B, in_dim)\n", " # weights.dim = (in_dim, out_dim)\n", " \n", " # given that the grad(output) wrt weight parameters equals the input,\n", " # we will save it to use for backpropagation\n", " ctx.save_for_backward(input, weights, bias)\n", " \n", " \n", " # linear transformation\n", " # (B, out_dim) = (B, in_dim) * (in_dim, out_dim)\n", " output = torch.mm(input, weights)\n", " \n", " if bias is not None:\n", " # bias.shape = (out_dim)\n", " \n", " # expanded_bias.shape = (B, out_dim), repeats bias B times\n", " expanded_bias = bias.unsqueeze(0).expand_as(output)\n", " \n", " # element-wise addition\n", " output += expanded_bias\n", " \n", " return output\n", "\n", " \n", " # incoming_grad represents the incoming gradient that we defined on the \"Backward Pass\" section\n", " # incoming_grad.shape == output.shape == (B, out_dim)\n", " \n", " @staticmethod\n", " def backward(ctx, incoming_grad):\n", " \"\"\"\n", " In the backward pass we receive a Tensor (output_grad) containing the \n", " gradient of the loss with respect to our f(x) output, \n", " and we now need to compute the gradient of the loss\n", " with respect to our defined function.\n", " \"\"\"\n", " # incoming_grad.shape = (B, out_dim)\n", " \n", " # extract inputs from forward pass\n", " input, weights, bias = 
ctx.saved_tensors \n", " \n", " # assume none of the inputs need gradients\n", " grad_input = grad_weight = grad_bias = None\n", " \n", " \n", " # we will figure out which forward inputs need grads\n", " # with ctx.needs_input_grad, which stores True/False\n", " # values in the order that the forward inputs came \n", " \n", " # in each of the below gradients, \n", " # we need to return as many parameters as we used during forward pass\n", "\n", " \n", " # if input requires grad\n", " if ctx.needs_input_grad[0]:\n", " # (B, in_dim) = (B, out_dim) * (out_dim, in_dim)\n", " grad_input = incoming_grad.mm(weights.t())\n", " \n", " # if weights require grad\n", " if ctx.needs_input_grad[1]:\n", " # (out_dim, in_dim) = (out_dim, B) * (B, in_dim) \n", " grad_weight = incoming_grad.t().mm(input)\n", " \n", " # if bias requires grad\n", " if bias is not None and ctx.needs_input_grad[2]:\n", " # below operation is equivalent of doing it the \"long\" way\n", " # given that bias grads = 1,\n", " # torch.ones((1,B)).mm(incoming_grad) \n", " # (out) = (1,B)*(B,out_dim)\n", " grad_bias = incoming_grad.sum(0)\n", " \n", " \n", " \n", " \n", " # below, if any of the grads = None, they will simply be ignored\n", " \n", " # add grad_output.t() to match original layout of weight parameter\n", " return grad_input, grad_weight.t(), grad_bias\n", " \n", " " ] }, { "cell_type": "code", "execution_count": 59, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "forward output: \n", "tensor([[0.7532, 0.5865, 0.9564]], grad_fn=)\n", "----------------------------------------------------------------------\n", "forward output shape: \n", "torch.Size([1, 3])\n" ] } ], "source": [ "# test forward method\n", "\n", "# input_dim & output_dim can be any dimensions (you choose)\n", "input_dim = 1\n", "output_dim = 2\n", "dummy_input= torch.ones((input_dim, output_dim)) # input that will be fed to model\n", "\n", "# create a random set of weights that matches the 
dimensions of input to perform matrix multiplication\n", "final_output_dim = 3 # can be set to any integer > 0\n", "dummy_weight = nn.Parameter(torch.randn((output_dim, final_output_dim))) # nn.Parameter registers weights as parameters of the model\n", "\n", "# feed input and weight tensors to our Linear Layer operation\n", "output = Linear_Layer.apply(dummy_input, dummy_weight)\n", "print(f\"forward output: \\n{output}\")\n", "print('-'*70)\n", "print(f\"forward output shape: \\n{output.shape}\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Code explanation\n", "\n", "We first create a 1D Tensor of size two and initialize it with value 1 dummy_input = tensor(([1.,1.])).\n", "We then a wrap a tensor filled with random values under nn.Parameter with dimensions (2,3) that represents the weights of our Linear Layer operation. \n", "\n", "> NOTE: We wrap our weights under nn.Parameter because when we implement our Linear Layer to any Deep Learning architecture, the wrapper will automagically register our weight tensor as a model parameter to make for easy extraction by just calling model.parameters(). Without it, the model will not be able to differentiate parameter from inputs.\n", "\n", "After that, we obtain the output for forward propagration using the apply method providing the input and the weight. The apply function will call the forward function defined in the class Linear_Layer and return the result for forward propagration.\n", "\n", "We then check the result and the shape of our output to make sure the calculation is done correctly.\n", "At this point, if we check the gradient of dummy_weight, we will see nothing since we need to propagate backward to obtain the gradient of the weight. 
" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "print(f\"Weight's gradient {dummy_weight.grad}\")" ] }, { "cell_type": "code", "execution_count": 60, "metadata": { "scrolled": true }, "outputs": [], "source": [ "# test backward pass\n", "\n", "## calculate gradient of subsequent operation w.r.t. defined weight parameters\n", "incoming_grad = torch.ones((1,3)) # shape equals output dims\n", "output.backward(incoming_grad) # calculate parameter gradients" ] }, { "cell_type": "code", "execution_count": 61, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "tensor([[1., 1., 1.],\n", " [1., 1., 1.]])" ] }, "execution_count": 61, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# extract calculated gradient \n", "dummy_weight.grad " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now that we have our forward and backward method defined, let us define some important concepts. \n", "\n", "By nature, Tensors that require gradients (such as parameters) automatically \"record\" a history of all the operations that have been applied to them. 
\n", "\n", "For example, our above forward output contains the method grad_fn=, which tells us that our output is the result of our defined Linear Layer operation, which its history began with dummy_weight.\n", "\n", "As such, once we call output.backward(incoming_grad), PyTorch automatically, from the last operation to the first, calls the backward method in order to compute the chain-gradient that corresponds to our parameters.\n", "\n", "To truly understand what is going on and how PyTorch simplifies the backward phase, we will show a more extensive example where we manually compute the gradient of our paramters with our own defined backward() methods" ] }, { "cell_type": "code", "execution_count": 62, "metadata": {}, "outputs": [], "source": [ "class Linear_Layer_():\n", " def __init__(self):\n", " ''\n", "\n", " def forward(self, input,weights, bias = None):\n", " self.input = input\n", " self.weights = weights\n", " self.bias = bias\n", " \n", " output = torch.mm(input, weights)\n", " \n", " if bias is not None:\n", " # bias.shape = (out_dim)\n", " \n", " # expanded_bias.shape = (B, out_dim), repeats bias B times\n", " expanded_bias = bias.unsqueeze(0).expand_as(output)\n", " \n", " # element-wise addition\n", " output += expanded_bias\n", " \n", " return output\n", "\n", " def backward(self, incoming_grad):\n", "\n", " # extract inputs from forward pass\n", " input = self.input\n", " weights = self.weights\n", " bias = self.bias\n", " \n", " grad_input = grad_weight = grad_bias = None\n", " \n", " # if input requires grad\n", " if input.requires_grad:\n", " grad_input = incoming_grad.mm(weights.t())\n", " \n", " # if weights require grad\n", " if weights.requires_grad:\n", " grad_weight = incoming_grad.t().mm(input)\n", " \n", " # if bias requires grad\n", " if bias.requires_grad:\n", " grad_bias = incoming_grad.sum(0)\n", " \n", " return grad_input, grad_weight.t(), grad_bias" ] }, { "cell_type": "code", "execution_count": 95, "metadata": {}, "outputs": [ { 
"name": "stdout", "output_type": "stream", "text": [ "outpu1.shape: torch.Size([1, 3])\n", "--------------------------------------------------\n", "outpu2.shape: torch.Size([1, 5])\n", "--------------------------------------------------\n", "outpu3.shape: torch.Size([1, 1])\n" ] } ], "source": [ "# manual forward pass\n", "\n", "input= torch.ones((1,2)) # input \n", "\n", "# define weights for linear layers\n", "weight1 = nn.Parameter(torch.randn((2,3))) \n", "weight2 = nn.Parameter(torch.randn((3,5))) \n", "weight3 = nn.Parameter(torch.randn((5,1))) \n", "\n", "# define bias for Linear layers\n", "bias1 = nn.Parameter(torch.randn((3))) \n", "bias2 = nn.Parameter(torch.randn((5))) \n", "bias3 = nn.Parameter(torch.randn((1))) \n", "\n", "# define Linear Layers\n", "linear1 = Linear_Layer_()\n", "linear2 = Linear_Layer_()\n", "linear3 = Linear_Layer_()\n", "\n", "\n", "# define forward pass\n", "output1 = linear1.forward(input, weight1,bias1)\n", "output2 = linear2.forward(output1, weight2,bias2)\n", "output3 = linear3.forward(output2, weight3,bias3)\n", "\n", "print(f\"outpu1.shape: {output1.shape}\")\n", "print('-'*50)\n", "print(f\"outpu2.shape: {output2.shape}\")\n", "print('-'*50)\n", "print(f\"outpu3.shape: {output3.shape}\")" ] }, { "cell_type": "code", "execution_count": 96, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "input_grad1.shape: torch.Size([1, 5])\n", "--------------------------------------------------\n", "weight_grad1.shape: torch.Size([5, 1])\n", "--------------------------------------------------\n", "bias_grad1.shape: torch.Size([1])\n" ] } ], "source": [ "# manual backward pass\n", "\n", "# compute intermediate and receiver backward pass\n", "input_grad1, weight_grad1, bias_grad1 = linear3.backward(torch.tensor([[1.]]))\n", "\n", "print(f\"input_grad1.shape: {input_grad1.shape}\")\n", "print('-'*50)\n", "print(f\"weight_grad1.shape: {weight_grad1.shape}\")\n", "print('-'*50)\n", 
"print(f\"bias_grad1.shape: {bias_grad1.shape}\")" ] }, { "cell_type": "code", "execution_count": 97, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "input_grad2.shape: torch.Size([1, 3])\n", "--------------------------------------------------\n", "weight_grad2.shape: torch.Size([3, 5])\n", "--------------------------------------------------\n", "bias_grad2.shape: torch.Size([5])\n" ] } ], "source": [ "# compute intermediate and receiver backward pass\n", "input_grad2, weight_grad2, bias_grad2 = linear2.backward(input_grad1)\n", "\n", "print(f\"input_grad2.shape: {input_grad2.shape}\")\n", "print('-'*50)\n", "print(f\"weight_grad2.shape: {weight_grad2.shape}\")\n", "print('-'*50)\n", "print(f\"bias_grad2.shape: {bias_grad2.shape}\")" ] }, { "cell_type": "code", "execution_count": 98, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "input_grad3: None\n", "--------------------------------------------------\n", "weight_grad3.shape: torch.Size([2, 3])\n", "--------------------------------------------------\n", "bias_grad3.shape: torch.Size([3])\n" ] } ], "source": [ "# compute receiver backward pass\n", "input_grad3, weight_grad3, bias_grad3 = linear1.backward(input_grad2)\n", "\n", "print(f\"input_grad3: {input_grad3}\")\n", "print('-'*50)\n", "print(f\"weight_grad3.shape: {weight_grad3.shape}\")\n", "print('-'*50)\n", "print(f\"bias_grad3.shape: {bias_grad3.shape}\")" ] }, { "cell_type": "code", "execution_count": 99, "metadata": {}, "outputs": [], "source": [ "# now, add gradients to the corresponding parameters\n", "weight1.grad = weight_grad3\n", "weight2.grad = weight_grad2\n", "weight3.grad = weight_grad1\n", "\n", "bias1.grad = bias_grad3\n", "bias2.grad = bias_grad2\n", "bias3.grad = bias_grad1" ] }, { "cell_type": "code", "execution_count": 100, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "weight1.grad = \n", "tensor([[-0.9869, 0.0548, 0.3107],\n", 
" [-0.9869, 0.0548, 0.3107]], grad_fn=)\n", "----------------------------------------------------------------------\n", "weight2.grad = \n", "tensor([[ 2.3822, 0.9312, 2.2510, -1.0365, 3.1596],\n", " [ 1.3770, 0.5383, 1.3011, -0.5992, 1.8263],\n", " [-1.3396, -0.5237, -1.2658, 0.5829, -1.7767]], grad_fn=)\n", "----------------------------------------------------------------------\n", "weight3.grad = \n", "tensor([[-6.3651],\n", " [-3.5532],\n", " [-5.9865],\n", " [ 0.7347],\n", " [ 5.3876]], grad_fn=)\n", "----------------------------------------------------------------------\n", "bias1.grad = \n", "tensor([-0.9869, 0.0548, 0.3107], grad_fn=)\n", "----------------------------------------------------------------------\n", "bias2.grad = \n", "tensor([ 0.6981, 0.2729, 0.6597, -0.3038, 0.9260], grad_fn=)\n", "----------------------------------------------------------------------\n", "bias3.grad = \n", "tensor([1.])\n" ] } ], "source": [ "# inspect manual calculated gradients\n", "\n", "print(f\"weight1.grad = \\n{weight1.grad}\")\n", "print('-'*70)\n", "print(f\"weight2.grad = \\n{weight2.grad}\")\n", "print('-'*70)\n", "print(f\"weight3.grad = \\n{weight3.grad}\")\n", "print('-'*70)\n", "\n", "print(f\"bias1.grad = \\n{bias1.grad}\") \n", "print('-'*70)\n", "print(f\"bias2.grad = \\n{bias2.grad}\")\n", "print('-'*70)\n", "print(f\"bias3.grad = \\n{bias3.grad}\")" ] }, { "cell_type": "code", "execution_count": 101, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "tensor([0.])" ] }, "execution_count": 101, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# now, we take our \"step\"\n", "lr = .01\n", "\n", "# perform \"step\" on weight parameters\n", "weight1.data.add_(weight1.grad, alpha = -lr) # ==weight1.data+weight1.grad*-lr\n", "weight2.data.add_(weight2.grad, alpha = -lr)\n", "weight2.data.add_(weight2.grad, alpha = -lr)\n", "\n", "# perform \"step\" on bias parameters\n", "bias1.data.add_(bias1.grad, alpha = -lr)\n", 
"bias2.data.add_(bias2.grad, alpha = -lr)\n", "bias2.data.add_(bias2.grad, alpha = -lr)\n", "\n", "# now that the step has been performed, zero out gradient values\n", "weight1.grad.zero_()\n", "weight2.grad.zero_()\n", "weight3.grad.zero_()\n", "\n", "bias1.grad.zero_()\n", "bias2.grad.zero_()\n", "bias3.grad.zero_()\n", "\n", "# get ready for the next forward pass" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Phew! We have now officially performed a \"step\" update! Let's review what we did:\n", "\n", "**1. Defined all needed forward and backward operations**\n", "\n", "**2. Created a 3-layer model**\n", "\n", "**3. Calculated forward pass**\n", "\n", "**4. Calculated backward pass for all parameters**\n", "\n", "**5. Performed step**\n", "\n", "**6. zero-out gradients**\n", "\n", "Of coarse, we could have simplified the code by creating a list like structure and loop all needed operations. \n", "\n", "However, for sake of clarity and understanding, we layed out all the steps in a logical manner. 
\n", "\n", "Now, how can the **equivalent of the forward and backward operations be performed in PyTorch?**" ] }, { "cell_type": "code", "execution_count": 103, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "outpu1.shape: torch.Size([1, 3])\n", "--------------------------------------------------\n", "outpu2.shape: torch.Size([1, 5])\n", "--------------------------------------------------\n", "outpu3.shape: torch.Size([1, 1])\n" ] } ], "source": [ "# PyTorch forward pass\n", "\n", "input= torch.ones((1,2)) # input \n", "\n", "# define weights for linear layers\n", "weight1 = nn.Parameter(torch.randn((2,3))) \n", "weight2 = nn.Parameter(torch.randn((3,5))) \n", "weight3 = nn.Parameter(torch.randn((5,1))) \n", "\n", "# define bias for Linear layers\n", "bias1 = nn.Parameter(torch.randn((3))) \n", "bias2 = nn.Parameter(torch.randn((5))) \n", "bias3 = nn.Parameter(torch.randn((1))) \n", "\n", "# define Linear Layers\n", "output1 = Linear_Layer.apply(input,weight1,bias1)\n", "output2 = Linear_Layer.apply(output1, weight2, bias2)\n", "output3 = Linear_Layer.apply(output2, weight3, bias3)\n", "\n", "\n", "\n", "print(f\"outpu1.shape: {output1.shape}\")\n", "print('-'*50)\n", "print(f\"outpu2.shape: {output2.shape}\")\n", "print('-'*50)\n", "print(f\"outpu3.shape: {output3.shape}\")" ] }, { "cell_type": "code", "execution_count": 104, "metadata": {}, "outputs": [], "source": [ "# calculate all gradients with PyTorch's \"operation history\"\n", "# it essentially just calls our defined backward methods in \n", "# the order of applied operations (such as we did above)\n", "output3.backward()" ] }, { "cell_type": "code", "execution_count": 105, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "weight1.grad = \n", "tensor([[ 0.2195, -3.4776, 3.3395],\n", " [ 0.2195, -3.4776, 3.3395]])\n", "----------------------------------------------------------------------\n", "weight2.grad = \n", "tensor([[ 2.6869, 
-0.6504, 1.1048, -1.9001, 3.5497],\n", " [ 1.7754, -0.4298, 0.7300, -1.2555, 2.3455],\n", " [ 1.1182, -0.2707, 0.4598, -0.7908, 1.4773]])\n", "----------------------------------------------------------------------\n", "weight3.grad = \n", "tensor([[ 0.0630],\n", " [ 1.2594],\n", " [-3.3520],\n", " [-1.9508],\n", " [-0.3700]])\n", "----------------------------------------------------------------------\n", "bias1.grad = \n", "tensor([ 0.2195, -3.4776, 3.3395])\n", "----------------------------------------------------------------------\n", "bias2.grad = \n", "tensor([ 1.3815, -0.3344, 0.5681, -0.9770, 1.8251])\n", "----------------------------------------------------------------------\n", "bias3.grad = \n", "tensor([1.])\n" ] } ], "source": [ "# inspect PyTorch calculated gradients\n", "\n", "print(f\"weight1.grad = \\n{weight1.grad}\")\n", "print('-'*70)\n", "print(f\"weight2.grad = \\n{weight2.grad}\")\n", "print('-'*70)\n", "print(f\"weight3.grad = \\n{weight3.grad}\")\n", "print('-'*70)\n", "\n", "print(f\"bias1.grad = \\n{bias1.grad}\") \n", "print('-'*70)\n", "print(f\"bias2.grad = \\n{bias2.grad}\")\n", "print('-'*70)\n", "print(f\"bias3.grad = \\n{bias3.grad}\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, instead of having to define a weight and parameter bias each time we need a Linear_Layer, we will wrap our operation on PyTorch's nn.Module, which allows us to:\n", "\n", "i) define all parameters (weight and bias) in a single object and \n", "\n", "ii) create an easy-to-use interface to create any Linear transformation of any shape (as long as it is feasible to your memory)" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "class Linear(nn.Module):\n", " def __init__(self, in_dim, out_dim, bias = True):\n", " super().__init__()\n", " self.in_dim = in_dim\n", " self.out_dim = out_dim\n", " \n", " # define parameters\n", " \n", " # weight parameter\n", " self.weight = 
nn.Parameter(torch.randn((in_dim, out_dim)))\n", " \n", " # bias parameter\n", " if bias:\n", " self.bias = nn.Parameter(torch.randn((out_dim)))\n", " else:\n", " # register parameter as None if not initialized\n", " self.register_parameter('bias',None)\n", " \n", " def forward(self, input):\n", " output = Linear_Layer.apply(input, self.weight, self.bias)\n", " return output" ] }, { "cell_type": "code", "execution_count": 109, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "[Parameter containing:\n", " tensor([[-1.7011]], requires_grad=True),\n", " Parameter containing:\n", " tensor([-0.0320], requires_grad=True)]" ] }, "execution_count": 109, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# initialize model and extract all model parameters\n", "m = Linear(1,1, bias = True)\n", "param = list(m.parameters()) \n", "param" ] }, { "cell_type": "code", "execution_count": 195, "metadata": {}, "outputs": [], "source": [ "# once gradients have been computed and a step has been taken, \n", "# we can zero-out all gradient values in parameters with below\n", "m.zero_grad()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# MNIST\n", "\n", "We will implement our Linear Layer operation to classify digits on the MNIST dataset. \n", "\n", "This data is often used as an introduction to DL as it has two desired properties:\n", "\n", "1. 60000 records of observations\n", "\n", "2. Binary input (dramatically reduces complexity)\n", "\n", "Given the volumen of data, it may not be very feasible to load all 60000 images at once and feed it to our model. 
Hence, we will parse our data into batches of 128 to keep memory usage and I/O manageable.\n", "\n", "We will import this data using torchvision and feed it to a DataLoader, which enables us to iterate over the data in batches." ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "torch.Size([60000, 28, 28])" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# import training MNIST dataset\n", "\n", "import torchvision\n", "from torchvision import transforms\n", "import numpy as np\n", "from torchvision.utils import make_grid \n", "import matplotlib.pyplot as plt\n", "from torch.utils.data import DataLoader\n", "\n", "root = r'C:\\Users\\erick\\PycharmProjects\\untitled\\3D_2D_GAN\\MNIST_experimentation'\n", "train_mnist = torchvision.datasets.MNIST(root = root, \n", " train = True, \n", " transform = transforms.ToTensor(),\n", " download = False, \n", " )\n", "\n", "train_mnist.data.shape" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "torch.Size([10000, 28, 28])" ] }, "execution_count": 5, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# import testing MNIST dataset\n", "\n", "eval_mnist = torchvision.datasets.MNIST(root = root, \n", " train = False,\n", " transform = transforms.ToTensor(),\n", " download = False, \n", " )\n", "eval_mnist.data.shape" ] }, { "cell_type": "code", "execution_count": 166, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 166, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAcEAAAHRCAYAAAASbQJzAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nOydebxM9RvH38cSlTVkTdZQRIutslSWX4siov1HJUWLSNKqkhaUSCXayBqVSkWpJKmfhFYpogWlxRbJMr8/xuecuXPnjnuvmTnn3nner5fXuHfOzDxzZ875fp/t8zihUAjDMAzDSEcK+G2AYRiGYfiFLYKGYRhG2mKLoGEYhpG22CJoGIZhpC22CBqGYRhpiy2ChmEYRtpii6BhROA4TkHHcbY5jlM1kccahhFMHOsTNPIyjuNsi/jxEGAnsGffz71CodCk1Ft14DiOMwSoEgqFuvtti2HkZwr5bYBhHAihUKiY/u84zhrgylAo9E5WxzuOUygUCu1OhW2GYQQfC4ca+RrHcYY4jjPNcZwpjuNsBS5xHKe54zgfO46zyXGc9Y7jjHIcp/C+4ws5jhNyHKfavp9f2Hf/m47jbHUcZ5HjONVzeuy++89wHGel4zibHccZ7TjOQsdxumfjPeh1rnEcZ9W+577LcZza+97Hln3vT++hjOM4bziOs9FxnL8cx3nNcZzKEc9X03GcD/c9z1zHcZ5wHOe5iPtPjvj7LHMcp2XEfVc4jrNm32NXO45zwQF8PIbhO7YIGulAJ2AyUBKYBuwGbgDKAicD/wF6xXn8RcAdwGHAj8C9OT3WcZzDgenAgH2v+wPQJIfvoy3QaJ/NtwGPAxcARwLHAV33HVcAGAdU3XffLuDRiOeZAiwEygBDgEt0h+M4RwCvAnftew+3AC/tW1hLAA8DbUOhUPF9dnyew/dgGIHCFkEjHfgwFAq9FgqF9oZCoR2hUGhxKBT6JBQK7Q6FQquBp4BWcR4/IxQKfRoKhXYBkwgvRDk99mxgWSgUmrXvvkeA33P4Ph4MhUJbQ6HQ58A3wFuhUGhNKBT6C5hDeCEkFAptDIVCL+97r1uAoXp/juPUABoCg0Oh0L+hUOgDYHbEa1wGvBoKhebs+3u9BSwnvFEACAH1HccpGgqF1odCoa9z+B4MI1DYImikAz9F/uA4Tl3HcWY7jrPBcZwtwD2EvbOs2BDx/+1AsawOjHNspUg7QuGKtJ+zYXskv0b8f0eMn4sBOI5zqOM44x3H+XHf+3sX7/1VAv4IhUI7Ih4b+fc5ErhwXyh0k+M4m4BmQKV9C+qFQB9gg+M4rzuOc1QO34NhBApbBI10ILoEeizwJVArFAqVAO4EnCTbsB6ooh8cx3GAylkffkDcDFQHmux7f6dF2VHGcZyiEb87IuL/PwHPhkKhUhH/Dg2FQsMAQqHQm6FQqA1QEfie8N/SMPIstgga6UhxYDPwt+M49YifD0wUrwPHO47TwXGcQoRzkuWS9FrFCXuhfzmOU4bwIg9AKBRaBXwB3OU4zkGO45wCnBXx2IlAJ8dx2u7rgyzqOM6pjuNUchyn4j77DwH+Bf7Ga0cxjDyJLYJGOtIf+C+wlbAnMy3ZLxgKhX4FuhEuLPkDqAksJdzXmGgeJlwE9AfwEfBm1P0XAi333X8X4fe/c5+dawgXEt0BbCRc3NOf8LWiIOHCnvX7HnsScG0S7DeMlGHN8obhA47jFATWAV1CodACn22ZSbhoJ17Vq2HkS8wTNIwU4TjOfxzHKek4ThHCntZu4H8+2NHEcZzqjuMUcBznTMKVq7NSbYdhBAFTjDGM1HEK4baJg4CvgI6hUCgZ4dD9UQmYSbgP8Geg5762C8NIOywcahiGYaQtFg41DMMw0hZbBA3DMIy0JW5O0HEci5UahmEYeZpQKJSlGIZ5goZhGEbaYougYRiGkbbYImgYhmGkLbYIGoZhGGmLLYKGYRhG2mKLoGEYhpG22CJ
oGIZhpC22CBqGYRhpiy2ChmEYRtpii6BhGIaRttgopRRywgknAHDtteFh3JdddhkAEyZMAGD06NF89tln/hhnGIbvPProowBcf/31fPnllwCcffbZAKxdu9Y3u/Iz5gkahmEYaUvceYKpENAuWLAgACVLlsx0nzymQw45BIA6deoA0KdPHwCGDx/OhRdeCMA///wDwAMPPADA3XffnUSrc06jRo149913AShRokTMYzZv3kyZMmVSaVbKOP300wGYNGkSAK1atQLg22+/9c2m3HL77bcD4e9YgQLhfWTr1q0BmD9/vl9mpT3FixcHoFixYgCcddZZABx++OGMGDECgJ07/ZhhvH+qVasGwJIlSwAoVaoUujbrfcyZM8cX2w6Eo446CoDChQvTsmVLAB5//HEA9u7du9/Hz5o1C4ALLrgAgH///TdXdsQT0E5JOLRq1aoAHHTQQQCcdNJJnHLKKUD4wwbo3Lnzfp/n559/BmDUqFEAdOrUia1btwKwfPlyIHgXoSZNmgAwc+ZMd6HXl1u264MtU6YMzZs3B7yTIbcf+v5o2bKlu+C+/PLLSXmNSBo3bgzAp59+mvTXShbdu3cH4JZbbgEynsQ2nNofqlevzs033wzgnjv169fPdFyFChWAcJgxiGzcuBGADz74AIBzzjnHT3NyzTHHHAN458r5558PQIECBahUqRLgnTfZOWf0d3jyyScB6Nu3L1u2bEmozRYONQzDMNKWpHqCxx13HADz5s0DYoc8s4N2DgpD/f333wBMnjyZdevWAfDXX38B/ofXFLo9/vjjAXjhhRcAqFixYqZjv/vuOwAeeughAKZOncqHH34IwB133AHA0KFDk2Jn69atqV27NpB8T7BAgQJUr14d8KICjpNldCKwHHnkkQAUKVLEZ0uyT9OmTQG49NJL3XCUduvipptuAmDdunW0aNECgIkTJwLwySefpMrUbFG3bl0g7BEAXHLJJRQtWhTwvlM//fQT4EVa6tWrR9euXQEvFLdixYrUGZ0NdE3L68Uv999/PwBnnnlmQp9XRYRPP/00CxcuTOhzmydoGIZhpC1J9QS1q/njjz+A7HmC2nlu2rSJU089FfDyYtqdBpmxY8cCuAU78ZC3qET+/Pnz3QKLBg0aJMfAfVx22WUsWrQoqa8hKlasSM+ePQHPMw7aTjwebdq0AeC6667L8PsVK1a45eu//vpryu2KR7du3QCv5L5s2bKup/T+++8DUK5cOQCGDRvmPk7HlC1bFvAKEvxC14wHH3wQ8N6XimAiUWSlffv2gFeD8M0337jvR7dBQ7URDRs29NmSA+Ptt98GMnuCv/32G8888wzgfceic4LNmzd3C+ZSiXmChmEYRtqSVE/wzz//BGDAgAGA1/S5dOlSt8JTLFu2DIC2bdsC4Ri5chc33HBDMs1MCGqEVzlzdM5r/vz5vP7664C3816/fj0Q/ntAOK952mmnxXx8olFpfyoYP368+3/t1vMCqmB+7rnngMyRjGHDhgUmh1OoUPhUVhXuuHHjAC9H/cEHH3DvvfcCuHln5TanT58OQLt27dznC0oVb6dOnQC48sorszxm1apVgHftUE5QOe+8gD4n5cwj0Weq6ElQvnOxeOKJJwB45ZVXMvx+165dbNiwIe5jS5Qo4QoEqJJU6PmS8b00T9AwDMNIW1LSJ6hVXM3iW7dudWPfV1xxBYDbzKoqKYCvvvoKgKuuuioVZuaaRo0aubFwNcIr3v3mm28C4Ryh4t2qcpWHpB6h5cuXu5Ww8iiVN0yUnNqxxx4LQPny5RPyfNkh0oPS3ykv8N///hfIXNmrnJrk7oLAJZdcAmT0usH7e3fr1i1Tf5Xya5EeoHpxn3/++aTZmhPUZxbNmjVrAFi8eDEDBw4EPA9QqJI0L6Aqd0UdBg8e7N6n/2/atAmAxx57LJWm5Yjdu3cDmT+L7NC+fXtKly4d8z59L5MhdpBS7dDIk3Dz5s0Z7lO4Y+rUqUD21AT
8RmoIAwYMcC/0v//+O+CFOnUx2bZtG7NnzwZwb+Nx8MEHA9C/f38ALr744oTYrIS1nj+ZaKFVewTAL7/8kvTXTQRly5bl8ssvB7zvoi5C9913n292xWLIkCEMGjQI8DZfagXQhitWg/Ftt92W6XdqJtfGzG9UUKWN8Ny5cwH4/vvvgXDBRVakcqOXKBSyjlwE8zsqvurZs2eW16U777wzaa9v4VDDMAwjbfFtioR2OiooUahQ5eja8QURFRQMHz4cCHtXasxVU6cSuAfqccVKlB8I0l8FL9ycLPT3KV++PCtXrgS8BuagIg3HmTNnZrpv9OjRgBfW9xvtjgcNGuS2EUlfUiHCHTt2uMerqVzhz2jhgiFDhrhajUFBYcLceEaSUcuLFChQIE9Ew3KDolqKXtSsWRMI64tGo4LJXbt2Jc0e8wQNwzCMtMU3T1AFMIr5q/BDpd3vvfee602NGTMGCI5IsYpVIhtCzz33XCB4At7xWLx4cUKeR8VA//nPfwCvSCOy4EK5DuXVgoregwqIwJP9U+O536ixunfv3kD4vJAH2LFjx5iPqVWrljvBQ9EXMWPGDMCT78tLXH/99Rx66KFA5ibsSMGJjz76CCBlAhEHyt69ewNzvcsJiqRceumlgBfZi0StR7Hen3LXEql/4403gIwRjURjnqBhGIaRtvg+WV6Nrhq98eyzzwLhnYR2E9rpqSRdlZd+oXYO7Tznz5+fMA9QTezKBySzaf6www6Le3/Dhg1dezQPsEqVKoAnSXXxxRe7x2i3Juk7lTMXKlTIHQ0VVORBaR4leE3lapWIrmj2C/3tIyXAVNV5+OGHA9CjRw/AG0VTv359V55PO3DdSsousj0paKiZXAIayodGRmOizx2xfv169++xZ8+epNuajsjrVk45t7UMCxYsAOCpp55KjGHZwDxBwzAMI23x3RMUGuej/p8RI0a43ofGCWmUjfq0Ut1zJtm3Ro0aAd5O+tVXX03Ya0QPnFR1VKKQtxYKhdxBlbfeemvMY4899ljXE1UT7Pbt2wH4+uuvAXjmmWfc3K28YYlJq8H14IMPDqxgdrxq0NWrVwPBE8dWJah6+cqVK8cPP/wAZJ03X7dunZtvUfO/elpfe+21pNqbWwoXLuyOY9PnI9v1PV6/fr2b71M+V16jKFiwIOeddx7g5XWTNaw63dH1Il4EKyuPHbxrrDx85QSTSWAWQfHFF18A0LVrVzp06AB4IdJevXoBniagtAJThdodFI5So+60adMO6HnVchFZBq4yfCWIE4WKKdauXctJJ50U99gff/zRDW9o0fv444/3+xpqbNaUAi0mQUStBLFOyMjQaJBQcZFCuK+//rob2lZ6QZ+bFEj+/PNPV4hCC4l+Dho6v/7zn//w0ksvZbjv7rvvBrzzY+HChe571++iJ8uXK1fOnXP3448/Ap6KVTIUSBJBrBYJzYMMomKMrtuagqPiOBVs/fPPP1k+Vqph0VNaUoWFQw3DMIy0xYlXhus4TiBqdCMLLMALzbVv397VcUwF0jGcMmUK4OnjRcqC5QR5gJK2GjRokBvilTelnVReQp6x/l7Dhg1zPa6goJC2wmzRifxZs2bRpUuXlNuVLFq2bOmGq+VhaDq7RAD8Rs3S99xzD+BNnwF46623AM/DkDdcrlw5N2Sm1iWFOtXyUb9+fbeFSbzzzjsZjvnrr7/c+zTVxU/27NmTZWhb7TuKzuR1JDmpubPgFXQlKhwaCoWyjM+aJ2gYhmGkLYHLCWqX06VLF3eOljxAoR3QBx98kFrjoshtQYy8EO10peY/a9YsOnfunBjjAkT0bLEgIFm+aNV6tXeoZSe/cPDBB2cqugpKTrBgwYKAJ6hw0003AeGWDUlrKfoiD1DXhtGjR7vFM5pVec011wBhwQ0Iizko/y3JLnkakfKMBxrZSSRPPvmkWwMRjaJE8uTzOu3bt/f19c0TNAzDMNIW3z1BCTqrMkiTpCtUqJDpWDW6qlk+1QK
z0eW/qs7L7uT7fv36AV4OULFwyVlJfNtIPmXKlAEyf4ck0bdt27aU25RMgpxblmcjD1BtOL169XI9tWbNmgGeCIBK6IsWLermEFVFHj3LbsuWLW5OUbcXXnghkHFE2Y033pjIt3VABLWlKBLlcNu1a+dW5uZE3kyjykaOHJl443KAeYKGYRhG2uKLJ1ihQgUuuugiAPr06QN4TcuxUDO2muQT2ZyeE6LlpuStjho1imeeeQbwKpy0c5X0W8OGDV3JMfUqaXeuAaj5DXnMtWvXDpRw8bPPPus27Eajxuv8ht95l3hED0xVjnDAgAFu72ytWrViPnbw4MFuD2BOJNGUY9Rt0Bg9erQbHdOoIaHI0+jRo92+0FTSokULwBPZaNu2rZtHjTdRXv2c8uIlPxkpbiBPMpmC2dGkZBHUhGfp/o0ePZq6devGfcwnn3zCsGHDAK/xN2jztXSy9u7d2y1okSqHGvoj0UKg0EEypyUHAW0WslpwUo0Kktq2bet+l1ROrzBo0NRhEkX0hTRIbNiwAfDEFdQ61LBhQ/cYlcqrGE7FVmvWrMm3eqCa91mjRo0Mv/f7OqiWmkhRgptvvhmIPy9U4iZqZYluAXn//fd54oknAK+oKRUE4+pkGIZhGD6QFE9Qbu/YsWMBbwcevaOJRGEouchz5sxJqUucHeTJaQ6fyrTBC43K6xUKj06dOjXbBTT5jebNm7vyXX6iOXyRn5HECVSUkV9ZsGBBXM1GP5EcmArN5Cn89ttvbppBzezppPmpSQqSjwwyakvJCZKdlHbtDTfcEFdeLVmYJ2gYhmGkLQnzBJs2bQqEk9lNmjQBoHLlylkeLy9Pqu6aFBHkmWaaiiBFejWzquUhEr0vTWpQI286kcxZiEbO+OKLL9zvoCIyyhNqGoVfKI80ceLEDLfpjkRBvvnmGwDq1avnpzkualO59tprAW/eZjxWrVrltr5oZuC4ceMAT3zbL8wTNAzDMNKWhAloa+xMpOit0E5Gsd89e/YwfPhwwJNBMvIPkhxTPmfcuHFZSkClEuVtp02bximnnALgzuHLqgQ/P6HPZfz48YA3/1Gl+PlFkNlIDari7d69O0OGDAE8GUJV77799ttAuMJfVcB+YALahmEYhhGDPDFKyTCMA6dEiRIATJ8+HYA2bdoAuINre/ToEeicvGHklnieoC2ChpFmaDGUApPK24899lgLiRr5EguHGoZhGEYMzBM0DMMw8jXmCRqGYRhGDGwRNAzDMNIWWwQNwzCMtMUWQcMwDCNtsUXQMAzDSFtsETQMwzDSFlsEDcMwjLTFFkHDMAwjbUnKZHnDMAy/OOqoowB46623AChYsCBHHnmknyYZAcY8QcMwDCNtMU/QMIx8wejRowHo1q0bAIcddhgAr7/+um82GcHHPEHDMAwjbTEB7YAzb948ABwnrP962mmn+WLH0UcfDcDZZ58NQM+ePQFYvHgxy5Yty3DsyJEjAfj3339TaKGRjpQvX96dh9isWTMAdE378ssvATj99NP5448//DHQCAQmoG0YhmEYMfA9J1i4cGEATjrpJACGDh0KwMknn+ybTUHgkUceAby/y4QJE3yxo1evXgAMGzYMgGLFimW4v2bNmlxwwQUZfvfpp58C8O6776bAwvxNsWLF3BzXP//8A8AJJ5wAQPHixQG4+OKLef/99wH45ZdfsnyuDRs2ADBr1izA+5zyIqoAHT58OE2bNs1w36BBgwDv/eUFL1CRnilTpgBw5plnutGXn3/+2Te70gHfw6Fly5YF4LfffgO8E/X44493/59OPPDAAwDccMMNAOzatQuAK6+8EoDp06en1B4VF2ji+OGHH77fx2zatAnAXRznzp2bJOvyPw899BA33XRTQp9z7969gPeZTp06FQhfgH/44YeEvlayaN68OQALFixwf6eF5JJLLgG8BSUvcMghhwCwcuVKACpVqsRVV10FwPjx432zK79g4VDDMAzDiIHv4dBoKlSo4N6
moyeo5L7CxB9++CGQeg9Q/PnnnwAMHjwYCIefwNu5/vjjj1StWjXDY0qVKgVA+/btgfztCaoJ++CDDwbgwgsvBOCaa65xj5k9ezYAPXr0yPHzn3feeVnepzDf559/nuUx3377LQB16tRxP5fjjjsOgPr16wMwZMgQAJYvXx54T1Bh0EmTJgGe9wfe30rh3rzE9u3bgYyeYHaiLnmZ/v37A3DQQQcBUK9ePS6++OIMx6xYsQKAY445Jml2mCdoGIZhpC2B8wQjd3Z5mZYtWwJw2223AWEPQV5VVlx44YXu7nzVqlUACc8H5ZYnn3wS8AplGjZsCMCWLVuyfMyYMWOSb5gPtGnTBgh7HvL8SpYsCXjl+ZHIu88N7du3p06dOoDn1Ql5D+vXr8/Wc6mQ5osvvgDI5MGfc845rtcaVC699FLAs/2NN97g6quvBuIXBeUVdM60bt2aunXr+mxN4mjVqpV7bWvVqhUAnTp1AjJe86PPn9q1awNe/lrFQonEPEHDMAwjbQlcdag4+eSTWbRoUbJfPmkolq2dTKtWrdz8XlZ8+eWXbuxb+Y2XX345iVbmnC5dugBw6623AtCoUaMsj9V7+eabb5JvWBJRdV6DBg0AaNy4caZjtm7dCni5KpXnT5482W1t8JuLLroIgBdeeCHD73fu3AmEoxeLFy9OuV3Z4aOPPgK879u6desAOOOMM/juu+98syvRHHHEEQCsXbvWFZuoXr06kH2P3w8qVqwIeBW5NWrUyHB/yZIlOfTQQwHP81uyZAkQ7gTYH/LycyuEHq86NHDhUHHCCSfk6UVQoSptMooWLZrlsTqxq1at6pavxzveT2bMmAF4BTtz5sxxF4do7rnnHgDOP//81BiXQMqUKcP9998PwOWXXw54RUI6eR944AFXlWTHjh1AuFAoSKjoYNSoUVx22WUxj1Ev6tKlS1NmV3Y599xzAdxeQJ1PL774IuD93fMbjuO4n90555wDwNixY/00KUvatGnDuHHjAG8Rj4dCmr///jvgOUKVKlXi2WefBaBKlSoZHqNwaDKwcKhhGIaRtvjuCe7evRuAzZs3A16BQc2aNX2z6UC49957AS90prDo8uXLMx2r8MDAgQOBcNvBxx9/DHgeV9BQCfOxxx4LeGX2sVi4cGFKbEoGd9xxB1dccQXgTSdQkdO2bdt8syu7SGNWjePdu3d375MAw/XXXw8EN1xdqlQpWrRoEfO+v/76C4ivpiLBiUjvJCiFZvsjMk0ljzCo3HzzzVl6gAq1Dxw4kE8++QTIXOClVp8bbrghkwe4Zs0awCuISgbmCRqGYRhpi++eoCS2JH+kKQV5De2ENF1BHm6fPn0A2LhxY6bHPPzww4CXM1u3bl3gNFNVpi2l/lq1agFQqND+vzqvvvpq8gxLEGr6lzeuHWffvn157733gHDeEwhMgUs8mjRpAng2FyxYMNMx8jJ++uknAPbs2ZMi63LGnj17XJ3UAgXC+3XlzD/44INMx/fr1w/w3t91110HZCymUIO2PI780FbhF+3atQNitwApN67zKTtRoWgvEDzhA+UPk4F5goZhGEba4rsnmB9o0KCB6ymp0kl5pPnz52c6XnmJyDwNwH333ZdEK3NHvXr1AK9MOzseoOjbty/g5Z6CyO233w54nqDk6ebOnZsnPL9ounbtCsT2AIVyTJq4rnaO1157jVdeeQXwGur9pFWrVm5OUB6gPIzIyRCqrj7llFMAr5pS/P33327uUMIDyrlL5H3t2rVJeQ/5GXnViqaA18py9913A/E9wNKlSwPhNhfwBEYin+eNN95IoMWxMU/QMAzDSFsC6wmWKVPGbxOyRN6QKu+efvrpTDkLjXpRU/mIESPcsUTKAappVLMCg9gHpGZ9eUoa9ZSdPkY10AYZzZ5THknNvnnRCwQvdysPXo39ilDE4sQTT3Rv77rrLgBGjhwJhEc5QWYxi2QieTdFH8BrFJ84cSKA2yB/1FFHMWDAAMDrKVT
+6O233wbC516JEiUAb8alqtCDiuM4MSX4gsRTTz0FhL9bqu6XIEN2hh9I7k4V9QBfffUV4EU0UjFEIbCLYHRII0gohCIlkVAo5C5+33//PZDxwgLh91O5cmXAWxxULKNm7CAzatQowLv4aCIBeJsChYB1wckL/O9//wO8z+mxxx4Dwk3YuojmJRRGOuusswBPY7Ns2bKUL18e8NSI9L2L1G7UZk5FJipMOf300wFvk5dMFNbUYGnwLrgSYNB7GT58OGeeeSbgqfaokV7hutq1a7vatzpGi2FQw6BBXwABZs6cmeE2u3To0AGAO++8M8Pvd+/e7ToCqZwgZOFQwzAMI23xXTtU3HjjjUA4dAHh6QSR3kYQ6NatG+BpL6oNYtOmTW4YQE28eh9STAdvx62/uW6162ndurU7PSIvofelUJp2eHovbdq0CcSOW9JbS5cudXUZFaJW8c4dd9wBhBviVfod1GbyA0XCB2olUHtFLG655RbAC48mE4XeIwvFoguyVHChzxQ8b1XFaLGmzyvMG9Sm+UjtUHHqqacCsYvs8iJqyYlee3r37u16/InGJssbhmEYRgwCkxOMFh4uXLiw2+QaBC8CvFl6slU71WeeeSbTsdpda2cTq6FUHpSasvOiFwheyX10jF/yXH41Yyv3qlYA5cduvPFG15uXKLZygfIEixUr5pZw51c08WLatGkAvPPOOxnK1CORSEIqUATIcZxMU+LVDlGtWjX3GOX+5CnFmj6vY+QJ5iXy6nUhmqFDhwKZhQ+EX56ueYKGYRhG2hIYT1D5NeE4DkWKFPHJmthoV6oydMlOxUIl6ZqpB7hTyDV+R8QTAc4LRJY4RyIP2a/399lnnwFetapyTdHz9MBr7BfvvPNOps8pv6Jzb8mSJVl6gitXrkylSUA4Z5RVzYK8iFAo5Iq5K0Kj9p0ffvgBgBYtWrgl/IY/HHTQQRx33HFAxs8OPKFzv+ZCmidoGIZhpC2BqQ4VGp5Yt25dt7end+/eqTYj16gJV/nCa665BgjH9ZWrCBoSJpDnphzR5MmT9/vYihUrutWT0f2ByiOtXr06YbbmBDXCSxrt4IMPznSMdp+1a9cGvPxz586dXU8y6FSsWAubQ0wAACAASURBVNEVbtfoLsm/ZQdJrM2ZM8cdwSTkJaryMrLSMlnEquqUfFrDhg0BT7ShWLFi7jHKsatZvkePHkBqpLcSRazqUH0382JuUJJql1xyCY8//jjgfU66vqh+QsMUkkGemiw/d+5cACpXruw27OYltGBLDUFKG9EXlyDx6KOPAl4TqxbrX375xVXZlwiAmqd1zIABAzItfmoPWbduXZItj48mw6tAR+GYNm3auMeo+EUXShVQ6P0GmQoVKgDw1ltvufMrc1LMo4ZznWexvqPa4KRi8RNqX9m+fbt7Ef3www+B+E3k0c3yeWnxi4fEACRGkReQ6o8mznfp0sW9T+1wKkZLhQBDPCwcahiGYaQtgfMERSgUcneEeQG1c1x55ZWAt2NVi0SQi1/GjBkDeFqNCke999577mRnhakVltJOD7z3qlDc4MGDgeDobw4fPtxvE5KCyv3lBYL3GWp6944dO9z7FA6++eabAc8DjPwsFaqSV+XHBJAlS5YA4UIy2di6deuYxz7//PPuxIulS5cCebup/NdffwXC59vRRx/tszW5R7MBIz1AhXMlwRgUzBM0DMMw0pbAeoIlSpSgY8eOgNeSEGQktiyPUGX4khILMosWLcpwK9vHjBnjNiXrNhaSiotsBzGSz7x58wBPcR+8thB5RZGtASraUm40FvIAO3XqBPjrVc2ePZvZs2f79vp+oOhXpAfftm1bIG/kBOvWrQuQqZ5j5cqV7tzAoGGeoGEYhpG2BM4T1K52586dbh4qL/Dcc88B3qiXV1991UdrcodEhSVSEFl+LrkqNfyLzZs3065duxRZaETyzjvvADB16lR3vJeI5+1FozaIkSNHumNxPvnkkwRZaeSGZcu
WuZXYkedh0JHsoIYNiMceeyww8pfRmCdoGIZhpC2Ba5afOnUqEJ6MrcG6Qd1BGEYQKFKkiJvDU6+fZM4ih1OreldosKwqSZVHNPynWrVqTJkyBQhXwAKueEgQUT2ARAyU/1N1/KOPPup+z/wgXrN84BZBwzAMI2/x4IMPAp7YhBwXNfr7uQCCzRM0DMMwjJiYJ2gYhmEcENKWnTNnDhDW3gUyzYP0C/MEDcMwDCMG5gkahmEY+RrzBA3DMAwjBrYIGoZhGGmLLYKGYRhG2mKLoGEYhpG22CJoGIZhpC22CBqGYRhpiy2ChmEYRtpii6BhGIaRttgiaBiGYaQttggahmEYaYstgkbCqFGjBjVq1GDatGlMmzaNf//9l3///Ze6dev6bZphGEZMbBE0DMMw0pZCfhtg5H1OOukkAN566y0ANm7cCMCYMWMA+PXXX/0xzMgWRx11FBCeXH7xxRcDsH79ej9NSgitW7cGYN68eRQoUCDD7+bPn++TVUbQ8G0RvPTSSwFo3749AA0bNgSgTp067jEff/wxAB06dABg8+bNqTQx5Rx66KG8//77AFSqVAmAk08+GYA1a9b4ZFV8zjrrLGbMmAGEL6IAt912GwDbt2/3za78QvHixSlWrBjgff8T/XfV9O+WLVty5ZVXAnD//fcDsHv37oS+Viro3r07ANdddx0Ae/fude97+OGHAZgwYQLgbdTy4vvMjwwaNAiA++67D4CHHnoIgFtuuSVpr2nhUMMwDCNtSek8wbJlywIwfvx417vbtGkTAIsWLcpwbKtWrTj00EMBWLFiBQBHH310Is1JOfLuypUrl+H3f/31FwCnnnoqzz77LADffvstAE2aNAFg69atqTIzW9SuXRuAZcuWsWDBAsDzKCJ33saBMWTIEHcXPGDAAAAeeeSRhL5GixYtAHjvvffc36mY6fvvv0/oayUTeYCKMrVs2dK9T+HQ6O9mrVq1AFi7dm0KLMwdRx55JAA33ngjAL179wagUKFCTJ06FYCLLrrIH+MSSPHixd3rXvny5QHYtWsXAH369AHg6aefztVz2zxBwzAMw4hBSnOCKpyoVq2aG+sdNmwYAH/++WeGY+vWrcv//vc/wEvc33nnnQDcc889KbE3NzRo0ADw8hHaxYH3PqpWrZrhMQ888AAQ9nQdJ7xh+eWXXwA46KCDkmtwDilatCgA48aNA+CLL76ga9euQP7wAA877DAAunXrxq233gp4Hry44447ABg6dGhKbbvrrrsAWL16NQCzZs1KyPNq152XKFWqFACNGjVyoyeKsBQpUiTDsStWrHA9QZ2DeYHLL78c8Dz/7777DoBevXoBcMQRR7jfCV0TFTXLSxQqFF6GrrnmmkzfRRXVRUcKE4l5goZhGEbakpKcYNu2bQHPE5w+fToXXnjhfh+n3c3tt98OeHH76tWrJ8KspHD99dcDsfM2O3fuBODFF18E4PTTTwegYsWK7jHyBC+77DIAXnjhheQZmwvkuV977bVAODf4888/+2lSQmjevDngVQ82adKEeOcGwMSJE+nRo0dS7YrMCYotW7YA0K5dOwA+/fTTXD23qk7nzp0LePln8KIuqfZ290fHjh0B6NmzJxD+G2SV7xNXXHGFe4wiGCJoOUFFfvr37+9+BvpO6txTHcXxxx/P4sWLAS+6pAhSXkI5aVXGR6LakTfeeOOAXsNygoZhGIYRg5TkBAsXLgx4lWaqaNof6j+TJ6h8VIkSJQBvRxwEBg8eDHgVfOL5558Hwg3kw4cPd/8P4XwGwJw5c4Bw9azu03sPCsqzXHLJJYC3a8vrXqAqlp966ikA6tWrB4Q/o1deeQXwcm/yzs8//3wAmjVr5u7c//3336TY98MPP2T6nb7/d999NxD+TFRhnBNU4RvpAQYVfe90PkUiLy8rFF3JzrF+o8jCkCFD6Nu3LwCjR4+OeWy7du347bffgLzpAVarVg2ARx99NNN98+bNAzJWLCeLlCyC777
7LgDHHXcckP1mX4UPhZKmKgdWc3YQUDvHwQcfDHjhFTWORypwKASjwgsl9Ldv3+5e2P75558UWJ19br75ZsALoel95XW0wGnxU2hQ7R6RaBPXpk0bAKpUqeI+bvny5Umx77nnnnMLc1QEISQ00blzZ8aPH5/j51bRgQptatSo4d43ffr0XNmbaLT4jRw5EvBCnjo/fv31V4oXLw54RU1Cx2zdutXdOAS1eEu233vvvUB4E/zEE0/EPFbFdhI2yKu89tprQMbWNzk2Cv3u2LEj6XYEe1tkGIZhGEkkJZ5gbr0a7VC//vprwNsxKIwTJBS+POOMMwDPs1D7Q+/evSlZsiTgJbrPOusswGsPue+++3j88cdTZ3QOUBHGwoULAfjss8/8NCdhRO80c9J2sGXLFn7//fdEm5SBPXv2MGrUKABX11ORBNGnTx9efvllAP74449sP7ciK5EeYJDo2LGjG/6M9uA++eQTIOyVq0k+uuhFkZaXXnrJPSZoqD1A55XCm9dcc02WUm4qlqtRowYjRoxIgZXJ4ZhjjgHIUICm69/bb7+dMjvMEzQMwzDSlkBPkZBkjm6DzLJlywCvqVOeoNog2rZt67ZNRDfLKw+YVQLcT1S+3KxZMwCOPfbYLI+VQr+Ke7766qvkGpcAVDShWxWYFC1alJo1awKeHNcJJ5wAwIYNG4BwbjoVBQkSzpa3EO0JNmjQgCOOOALI2hM86KCD3CZroQKfoKG/t/KA4EWT5AGqFSkS5WXlPUbm1BSpUWtFUIqBunTpAnhN/KeddhqQWTwEvFoInYvbtm1zi+3yEoqE6ZyTJzhv3jw3J5pKzBM0DMMw0pZAe4Iqy1drhAiamDR4lazRbRtqhJ85c2amnY/EYFWKH0SUh/rmm28AL08runfv7uYlSpcuDXh/i5tuugnwxtUEkei8RL9+/YBws7I8P3HBBRcA/rWvKMrw3//+N9N9avZXREIzHnVbrFgxt9UoHpLdyk3LRaKQLJ0qrsFr2teIp0g+/PBDAN58800g9vzKbdu2AZkrzv1Gn6WEoz/66KNMx1SoUAHwBDjU5jF69Og8NatT1wEJHuic+/zzz4HwtcaPqnjzBA3DMIy0JdCeoJopIwftgie/FomanjWcVzvjF1980d1lpYLsyC9JAkjx/J9++impNh0IEvFVPkI7aTWJ33XXXW6uSU3/6rGTsPGqVatifmZBQDk09ZqdeOKJQDhfoZ2q+lpVpewX6gVU7jVSevCxxx7LcBtNgQIFstUjp1y2duu5HV2TGyQeoc+iQIECFCxYcL+Py8m4J0VjgtI0r15PSaRF1j+ot3HmzJmAd41Tf/SDDz6YMjsPlCZNmrjfKXm2QkIVqiVINYFbBBUCrVKlijtVPRp9CZYsWcLxxx8PeM2mKhBQyLRWrVopKY/WyapCkkiVCjF79mzA08MLOsccc4xbwh1drq2/+1tvvZUpPDht2jQATjnlFCA8LTqoi6DCoSo2qFKlCuC9BwiX2IP/i6BQ+Fnh2eywd+/e/WqhRqK/RyoWwfr16wPexV5h9UQ2tkvkQZs3v5vmVTAnoltz2rdvz9ixYwGvkE6LvVo/gqSYtT8uv/zyDBrJ4KVYEjUNJbcEYztkGIZhGD6QEk9QUmKHH344EC41b9q0KeCVBEcfG2+KvHbvaj4HeOaZZwDP21KYK5b2YjKQHup5550HEHPXnZOdeBCIDFtEh5TV/hCv2EIl6l988UUSrEssH3/8MeDNg4wkaJMUcsP333/vfv90jqj1QqE4v5AYQHTrUCJRK0JQWiPUFK9CEEUeFAouV66cm3pQVEmFJfrc8gLSP73iiisyXf80XWjdunUptysS8wQNwzCMtCUpnqC8OU1WUA6sbt26WT5G8W2VMu/evdvNRwkVBign6Ld0l4SNe/ToQefOnQHP25NtauDt0aOH6wnnRaKnRWSnTSUvTphQfiq7hSRBQ03WP/74I+DlD6dMmZLpWAna++0JZoVE2w+
UunXr8tBDD2X43Zo1awD/hOoVHbn66quBsKcE3vViypQpbpGT5kUqR5gXUG2GRL4LFCjAnj17AE/ezm8PUJgnaBiGYaQtSfEE1fytmK9i28pF/PDDD25FkO7Tzkzew4oVK1wpITVoq5FZ3qLfqMLrnnvucX+nHJl2cSoL7tGjR2CqC7OL4zgxq1yzS6tWrYBgihtkhQS19+7d685MTNaswNyyatUqACZMmACEhZRVaScB4gPNw0owXZWafjTP50QMPBaKPM2aNYsyZcoAXi5OOUK/m831GepW59vIkSNdgXNFmYI2Xi0WkvR79dVXgYztbWr2HzhwYOoNi4N5goZhGEbakhRPULtIVWZqJ7N06dKsDdmX/1MDaJUqVdxdW9euXYHgeIBqVlZVG8A555wDwDvvvAN4lZWR+RZ5u3mFUCiUq4rWwoULA16+Y+LEiQm1KxmoSVy5mY0bN7rVrUH73JQ/l5BBMqhcuTLg9dUlk6wa2J999lnXQ8oO6gXUY84991z3PkWTzj77bCBztXNQUPTk2muv5b777gNg8eLFfpqUI+T5RQucgOcdBo2kLIK6cG7atAmIH5qRLuiLL74IeDP2du7c6TYD+10AE43CvGrRmD9/Pq+//jrgLQA62XSM4zhJnz2XaL7++mvWr18PeBO+s5p2Dd571zFS/ImldRkU9PmomV8X/4EDB/qmEZoqdH5qKka0kgd47SFSBcpqxt2BMGTIEMBrE4hsfXrvvfcA75qiNIoWsZtvvtldRLVgqw1CSj9Dhw51BQ+CuviJyZMnA+GikehinryAREuief/99wObDrJwqGEYhpG2JMUTXLlyJeBpAUobTsnp5cuXu+GJAQMGAJ77rHlhvXv3jhs+9RPtSiNv5QWpEObRRx8FvIKC8ePHB3ZqfFasX7/e9QSiJ1hPmjQJgJo1a7ozBiXnpAS+wuJB9oC125YHKNGDvDyxO7tEpyteeukltxhDyIvX/L5keILz5s3LYIfk00qWLEnLli0BT+ZMsoSRKIyqY+bPnw9kLjoJMtKslT7o9ddfH5j0T07Iah7gE0884etkkniYJ2gYhmGkLU68wgfHcQ5I50u7As2Vi6XcrmSphHqDKrQciZpW1Qg6Y8YMdwcdvVOVZ/jaa6+l0MLE06dPHwCGDRsGeELn4LVAqFBIOZ6gtRZE0qZNG8DLMcmLUO7Tb1FfP2jcuLH7PZVHItQOJC8rmcgrv+qqq9yWo3jCBSqgW7BgAeDlL/OCvJhqIjRHUC0p9evX5++///bNrpwiKUu1FSk3ePfddwPhtcBP2chQKJRlr5d5goZhGEbaklQBbU2I1m1+QY3JokuXLm6FmmSrJHarlom8jt5PkKfEZ5dq1aplGJUEXu4rHT1AsXjxYleQQtEbCVxIuisV/PLLL0B4VmVWtQMrVqwAwpEJiQcsXLgwZTYmih49egDeHFTVUeQlLxC80VsSABcSQwny8ADzBA3DMIy0Jak5wfyK4vY9e/YEwp6udsrKcUoiyAgOEnYfNmyY28ivSsRu3br5ZpeRvqh3Th5T48aNgeRU4aaCtWvXAnDIIYcAXoW435X+8XKCtggaaUPv3r0BGD16NIsWLQK8gg9dhAwjlUioQPrDea2NKq9ghTGGYRiGEQPzBI18j2S0JJ319NNPuzPN8uLMQ8MwcoZ5goZhGIYRA/MEDcMwjHyNeYKGYRiGEQNbBA3DMIy0xRZBwzAMI22xRdAwDMNIW2wRNAzDMNIWWwQNwzCMtMUWQcMwDCNtsUXQMAzDSFtsETQMwzDSFlsEDcMwjLQlsIvg5MmTWb16NatXr6Zp06Y0bdrUb5MMwzCMfEZgF0HDMAzDSDaBFdBeuHAhzZs3B+D7778H4JhjjgFg165dfpmVbTp37gxA0aJFATjxxBPp27cvAO+99x4QHukD8M033wDw2WefpdpMwzCMfE+emix/xBFHAOGFr3DhwhnuO/TQQwHYsWNHqs2Ky8EHH0zdunUBuPf
eewE47bTTAChSpMh+H//DDz8A8O677zJw4EAAtmzZAsCePXsSbq8R/swA2rdvD8Bdd90FQKNGjQCIdV5cccUVAPz111/u77RB+/LLL5Nn7D46duwIwHXXXQfAqaeeCoDjODHtBXjllVcAePPNN5k7dy4AZcqUAWDlypUAbNu2LXlGG8Y+jj76aADXGahUqRIAZ511FgCzZs3io48+yvCYp556CoBNmzYd0GvbFAnDMAzDiEHgPMEGDRoAsHz5cvd32s126dIFgL1796barAwce+yxALRo0QIIexPazSSKu+++G4CXX34ZgC+++CKhz59MqlatCsCiRYuA8N8nFZ5SNHXq1AFwvetIDjnkEADOP//8A3qNr776CvC+m/KuEok8wAkTJgBeRCSnfPfdd4D33n///XcA/v33X/eYfv36AWTakRvGgfLwww8DcMMNN2T7MYq63HbbbQCMHTs2V69tnqBhGIZhxKCQ3waIQoXCpsTatU+ZMgXw3wMU8gBHjRqV5TE//vgjED+nV7FiRcArnolEOSrt1v3yBI866igA/vnnH8B7X/F44oknAM/D2Lp1a5Ksi8/bb78NQOXKlZP2GirWWrx4MQDPP/88ANdff33CXqNcuXJA7j1AUbt27Qw/x/q7TJs2DYDzzjsP8N6XkXP0een8PvvsswEv75xTdL1RDUFeI6uoy9KlSwH45ZdfMt2n2ooLLrgAyL0nGA/zBA3DMIy0JTCeoOLFF110kc+WZB/lKjt27MiGDRsAGD9+PADDhg0D4lfeyVt45JFHkmlmrunUqZPr2cgzjWerWlratGkDwAMPPADA2rVrk2lmlsirUZ4rks2bNwNeNe9VV10FeJ5vTilWrBgArVu3BjwPUTnDA+Hxxx8/4OfILqrY+/DDDwF45513ALjkkkuAjJWxQaVgwYIA1KhRI9N9imTs3LkzKa990UUXcfLJJwO4t6pzOFDOPPNMwItE/fbbbwl5Xr9QjlrvK/L9lC9fHoBPP/0UgIYNGwLQvXt3Zs+eDcDGjRsTYod5goZhGEba4rsn2LNnT8DrwcoLTJ48GYCJEycC4col5czWrFmT7eeJl2/5+++/gcTtdnLDxRdf7Hq72fFWzz33XMDL786cOTN5xmUDeeXKUUaye/duwPMMXnrpJQAGDBgAhCMT06dPB+DII48E4LDDDtvva8qTUh4vEcgLlzcmPv74YwCeffZZ93cnnngi4J1XolatWq6HlB30Gf7nP/8BoEqVKkDuPcFzzjkHgFdffTVXj4+mRIkSALRr147LL78cgIMOOgjA7S+WxxTJnXfeCcCQIUMSYkc0kyZNcmsXdBsdCVmwYIF7XksoIxb169cHvIhRrVq1gPB5CcGNIGUXXeMiPcDSpUsD3vdX55N4+umn3etK165dE2KHb4tgjx49ABg9ejTgfYGlmnL88cf7Y1g2iL4QqLF9f+jkHDp0KBC/PF8FQi+++GJuTEwIJ598srvQZwd9YR0ny2rklKIFYdKkSfs9Vheqa6+91v1dt27dAHjhhRcAaNKkSZaPl4CDTt73338/5wZngb4vuo3HggULgMwXyGuuucZtjRC33norAKVKldrv86oFJLcFWipSyimHH344AG3btgW8tpdWrVoBGRc6XTtmzZoFeAUpjRs3do/Re07WIrhy5Uo31KrX0GYqJxxxxBG0bNky5n1+pRcSjYRR9D6//PJL3nzzTcA7d6PZu3evuzFPFBYONQzDMNKWpHiCKhJQMlO7N+3Iunbt6rq9Qg2Ub7zxBuAlTfMLp556KjfeeCNA3Mb61atXA16TvB/IoytatGiWclyxkF7q9u3bAa+twi+y4wGqjF0eh4ppwAu5RbcWRKLCp6uvvhrw93OLR6yQsCSp9DcYOXIkZ5xxBuCdw6Jdu3aAF5rNKbmVOpRHLVlCRRn0vXQcxw1lX3PNNYAXXlPJfePGjd3HSXAgWehal1uqV68OhL3H6GiYPNzcetVBQ/J90lKOh9J
MDz30kJuOShTmCRqGYRhpS1I8QcV6NSUhuux88+bNjBs3DoDhw4cDXgOoEvD5BeU+x44du9/ChHvuuceNd6vlwg/WrVsHhHOf8hIkBB6vtFyi1MuWLQNg1apV7mOTVZKeW/S+VDyTmyT75s2bufLKKwGvsCYvIREDFSi8++67WUYpRo4cmTK7IpHHJ09Scor33XcfEM4j/fTTT4BXiKLipsGDB7vPowiL5AiDgvK0aiuSdx6rsOqOO+4A/BOfSCXKPasw648//gCSM0HIPEHDMAwjbUmKJ6iyX+UEo3MqW7ZsyZb8ljhQuahUInFttQto9xbLC1TOTHnQCRMmBEoSacaMGfTv3x/wdqaDBg0CvJ11LNSkrEbroUOHBi6PoYrIAymz7tevX570AKNR7jNW3lBtGNnJ2yQDff80skq3sVAbhrw9VYeuWbPG9bTWr1+fNFtzg7xVvc94jBkzBsgowKFm8ueeew7IWYtWKqlTp44bKcqKvXv3uqIVyq0f6Ail7GCeoGEYhpG2JLVPUHmgnIzRUbx7w4YNVKhQAfB2eNrtBI3ChQu7jazq69PPYs+ePZni2WrcHTFiRAqszDn333+/+z7U0yjPSb1P//zzj+v5qQJPnoUao4PmBYKXY9JOOqu+pHgMHTrUzVFJBDgvoYrYWOeV8mu6zy+Jrrfeemu/xygHqPNJHqByhe3btw9UhCWS6OtEPGI1/6uat169eoAnO+n3MG6JLWjw87hx4zJ1BOhaP2fOHCCc5/38889TaGWYwM0TFAsXLnS1KDWJON7UBj+5/fbbs0y4q3l52rRpMcNNQUdhXDWOqw1CpdzgqTxIWUWhYDW+Sp0liOgidMIJJ2S6T7qpEjmIhTYwN998cxKsSyz6LLt37w54Kk1NmzZ1j9Hkj4ceegjIfUtEqujQoQMzZswAvM9JofrrrrsO8L6HQUTT1tUuEAttVi699FLAUwg68sgjefDBBwFPbERzIE899VRfzjtdFyQaEWsqkJAwRSquizZP0DAMwzBikCc8QXkhfkqIRaJCHXkRM2fOzOAZgVdIoN1b0BLyiUTvUZ6T5iT++uuvvtmUCNQW8uijjwKZ9TjBCzsp7LNw4cIUWZdz5PmpDD8WH3zwAeC9n6CiVo4pU6a4jf0qtpPEWl4X3NAUCrWDXHbZZUDGmZ5qqNecPf3coEEDvv7665TZqkiKCsWy0+rWvn17wCugSybmCRqGYRhGDHyfIhEPealBm5ulnEqsHKVknjp16gSkR2NrrLlt+QEVdkmmSUVBJUuWdI9Rni0oouHRqGikT58+lC1bNuYxEoBv2LBh4EQNopEHKFGJggULuqIMknYLahFMTmjWrJk7j1P55lhtZRINl0SgPMG3336bypUrp8JUGjVq5H4e0VMfFCl5/fXX3VqBoGGeoGEYhpG2+OIJ1q5dO1O5rErWJY/zyCOPuJVPatTWraSG7r33XrcyLFFzyuKhMuRYlYDz5s0DvJlv2fEAVU2pHOOQIUPc30WjBtlBgwa5FWBBoEiRInTo0AHwWmGyO1oqr6AdeCzJJuVi/ve//6XUpqw46aSTAE+QXjPpJGUYiRrhVQmaEwGLVKPvmDyeSPEJ3ZcfPEBx0003uc3lK1as2O/x+v7pO6r2slQwadKkTB6gBEAki9moUSPzBA3DMAwjaCTVE1TvSs2aNQFcseFevXplGvCp/iR5PJFTvNWYrWnMet6SJUu6QtPJ9gQbNWrk2hGr8klyTpKIi85jDh48OJN0miZEZ+X9RSIh7iB5gRDuETzuuOMAXM89t2NzgoY+S+UEY+XU9H3V99dv9N3UENxIJJStvqz7778fSI00VW6RB6sBtaoE1fnVp08fvv32W3+MSyJly5Z1z6spU6YA3lBlVfCCJ2IhLyteT2uiUdTrqKOOckUnJE8nSUh5pr169UqZXTklKYtg+fLlAU95Xi0OsVDrgIpgvvrqK8BTi98fyZ4PJr7//nveffddwFvUI9GHrOK
J6JBg1apVD6h4IqjTNSKnDsycOdNHS8KcdtppbktDNJr5F9m6sXnzZsC7IMyfzQAAIABJREFUeGhz1r9/f7eEOzublKBwzz33ZHmfJrfMnj0b8H/e4/4oW7as23Ki779Ctv/9738BmD9/vj/GJZlly5a5CjFq+VDLxO+//+4ep+KX6A222mGSgWYm6rtWoEABtwAmqzTQeeed5/5fG5igFDxaONQwDMNIW5LiCUq/LisPcPbs2a7clHZ6yZgTlUi2bdvmFhuoRD7WBAIV/EQX/uQUyVX9+eefgDebMWhETr9esmSJj5aEKVGihCtFFU1kGEkovKxQZ/Tsy3g899xzriye38iTVRQmFpIf1K2U+rdv357p2GeeeQbwimcUqUllC8UxxxzjeoCSANO1JWhpgUQzcOBAVwNVIg2KUlStWjXLx8nbnzhxYtJs0/UvcuahCmGEziN57NISBe9754dOaCzMEzQMwzDSlqTIplWrVg2AWbNmAd6kchWWSAA2r3L22WcD3o46pxJTUre/8MILAWLKGym2LjX/oKFZkUuXLnW9+Vgq96mmY8eOSctNqoBEn0mLFi2yVb6eCjp27Ah4RRQqHksUKsG/7bbb3Nx4smnYsKH73Zo7dy6QMbeU39FnqGIg5bRjiW3r89E1Nt51/UBRi5gKq8A7N+ShqzUn0mtV/lbf1VS2UplsmmEYhmHEILAC2nkB7dA6dOjger8q5RYSK47MR2nUyyeffJICK5ND69atgbBIgOTjbrzxRh8tCnPcccfRp08fwGtBya1XpMpjVb7ddNNNQLBbCpSDkdegnFp0M/P+UOO58lCKXtSoUSNDLijZyJPQ3zy/CTHkReTlyTuPl4cWO3bs4IILLgDCEmqpxjxBwzAMw4iBeYJGrpDUVvfu3d2+yaCJhctzUxN/PDQUedmyZe7v1E/n95TuA+HEE08EvBwueL2dsWSs7rjjDsDr+ZRohUSqO3ToENhKZSO1NGjQAAh7hBr8G42mxg8fPjxlueRYxPMEbRE0coUWwZNOOolTTjnFZ2sMwzCyxsKhhmEYhhED8wQNwzCMfI15goZhGIYRA1sEDcMwjLTFFkHDMAwjbbFF0DAMw0hbbBE0DMMw0hZbBA3DMIy0xRZBwzAMI22xRdAwDMNIW2wRNAzDMNIWWwQNwzCMtCWwi+CQIUMIhUKEQiHWrl3L2rVrKVasmDvDzzAMwzAOlMAugoZhGIaRbAr5bYAoWLAg4M0z69evH2+99RbgTWCvUaMGAJ9//rkPFhpG3qZ9+/YA3HLLLQC8/fbbAHz66aeANyncMNKJwCyCF1xwAQB33nknAA888AC33nqrnyYZRr7i7LPPBqBly5YAtGrVCoD58+cDsHDhQv7++29/jEsAZ5xxBgCvv/46AOvWrQPgqquuchf6jRs3+mOcQYEC4cCjrvF33XUXAIMGDQLC13xf7PLlVQ3DMAwjCKj4JNY/IJTsf02aNAk1adIktHHjxtDGjRtDn376aejTTz8NFSlSJOmvbf8O/F/jxo1DYs+ePTH/3X777b7bme7/zjzzzNCWLVtCW7ZscT+XvXv3hvbu3ev+XKlSJd/tPJB/Z5xxRuiMM84I7dq1K9O/l156KfTSSy/5bmO6/qtevXpowoQJoQkTJmS6Prz22muh1157LamvH2+dM0/QMAzDSFt8zwleffXVAJQuXRqAESNGALBz507fbDKyz1133cXevXsB3NtYx5QrVw6AmTNnAvDBBx+kxsAkc8QRRwDQt29fAJo3b57hdtGiRZx00kn+GAfUrFkTgEmTJnHooYfGPGbSpEkA/PbbbymzK1GUKlWKxx9/HPBynbFQTtDwhxEjRnDuuedm+N2uXbsAeOONN/wwycU8QcMwDCNt8c0TVGXapZdeCsC0adMA/yqEkk2DBg1cj+CJJ57IcJ/jOADMmTOHRx99FIA333wztQZmk2rVqgFhWwEqVKiQrcdde+21AKxcuRLIm55g165dAWj
atCkQ9vbk8WXFzz//nHS74nHDDTcAUKJEiUz3zZs3D4B7770XgN27d6fOsARx7LHH0qRJEwAqVqwIxI5I3H333QB89dVXAMyaNSvpthUtWhTAjYKIk08+2T3na9euvd/nkcf0wgsvAF7V6549exJma7I46qijgPDnFI2qQqOvh6nGPEHDMAwjbfHNE6xfvz7gNcn/8ssvfpmSFM4//3wAunTpAoR7tLQz3Fd566Kf27Zt6+6YbrvtNgCeffbZlNibXQoXLgx4wgX5lSOOOMLN8+mzVP4vFosWLQJg5MiRAEyfPj3JFsZn2LBhAHTr1i3LY9q2bZsqc5JG6dKls8x1xmLs2LGA5y2+9tprSbEL4LnnngO8CMKBct999wHw6quvAnDFFVfwxx9/JOS5E0358uUBL99XvXr1TMcEJU/r2yJ41llnAbB+/XoAnnnmGb9MSSgXX3wxAA8//DAAZcuWBWDixIluCPGggw4C4MEHHwQyhkv05alSpUpqDM4hCivFo2fPngCceOKJAPTq1SupNiWDESNGuIufePHFFwGYMWOG+zu/F7usaNSoEeB9/8C78I8ePdoXmxKB0gUKr0eiZux46Pw68sgjE2tYDIoUKQLAZ599BuS+2E+OgsK+55xzDgDNmjVj9uzZB2pmUihZsiQQe/H76KOPAC814jcWDjUMwzDSFl88wdKlS3P88ccD8PTTTwOwYsUKP0xJON27dwe8Hbh0GkeNGuXuBOUJ1qtXD4DevXsDcOihh7q79a1bt6bM5nhES1HFQmEaySEJFWMUKFDA3aWrICCoqEWnWbNmruf3yCOPAF7IM8gcc8wxABx99NGZ7lN4rl+/fqk0KaEodZBVOw54HvuCBQuAcOvEeeedl+GYTp06ATB16lQAfv/994TbqqiQClt0m1MKFQpfplXI1KJFCwBat24dWE8w+u8NngeoFNGvv/6aUpuywjxBwzAMI23xxRO86aabOPzwwwH/S8gTTfHixTP8rLLzXbt2uQLG2s2qZSIysa8yaBVYBIV4O+9oD1DE2rVHFwUFBRUvyEtatGgR/fv3B+Cnn37yza6cctVVVwGxW1eWLVuWanMOmMqVKwNeI/wll1yS4f5Nmza5XtySJUsA6NOnDwA7duwAcK81kej5lLtKhie4ffv2hDyPrg/yAEVQ89HgRbnE6tWr3Rx7Vh5gqVKl3DoJ1YqoMHDt2rXJMtU8QcMwDCN98cUT7Ny5s/t/lfvmF6J3OfKS+vfvz7Zt24CsG2RfeeUVdxcbFPZXDarG3UiU84ysTAw6aocQixYtchvhVambF3KCzZo189uEhNGwYUM3MqIcZ3REYuLEiVnmOHWe2Ui21KHq29NOOy3D78eOHcuGDRtiPkZVr2PGjHFrRYRqLI477jgA/vrrr0SaC5gnaBiGYaQxKfUEDzvsMCAc4/7www8BstwdxKNq1aoA/Pjjj4kzLkGo0lM7osaNGwOxZauimTNnTsLyCIlCDa3aiUWjHFQk1113HZA3duDKq0QKXkNYvGH48OEAfPzxx4CXvw5ijrBOnToAVKpUCYhdhavmeH0XdYze57///pt0O3NCy5YtqVu3bq4fryjF2LFjs+xVVaQjOtcYJDp27Oi3CdnmyiuvBLzvoa5nn3zySaZjNTThjjvuAMjkBYInUKGey2SQ0kVQX+jKlSu7hR/x9O+kBahwR6lSpQBPBWPnzp2MGTMGgMGDByfF5pwi5Rsl3qUS4ziOa3e0Vt6ECROA4KnDgNfoHh2GUhhbxQiRxFoYg0Tz5s3dVggtfiqCUXn9Tz/95BZl6PunY4KIwoW6+MQqQOrQoUOGWy2CauNZsmSJW9qejLBTdlEhSL169dzWGt1+/fXXALRr1w7wCijiEdmiE/08AwYMSKDlyUGLhdDm/7vvvnN/J/ERXW9ioQkuyUTawuKHH34AvHaVSCR8cOaZZ7q/0yZ0+fLlgDdlKJlYONQwDMNIW3yTTdtfc3znzp3d3bp25Gr
2nTx5MhD2OFRkExRPUCi0pNt69eplSuDPnz8fCLeMQO6baZPFG2+8kUmKSrvPyOKmaORhRD5WknHy3P0kcvqDvDvJ3KUj8rxatmzpevESsUhG68D+UIiyZ8+emSIQTz75JJA9D1BeceTzyANUI3t2nieVqLWgaNGibvoo2ltVoVZk20CxYsUA79zbtm0bS5cuBeCll15KrtF4RXCKMohIbzUapbXEhg0buOiii4CwEACYJ2gYhmEYScU3TzArRX7lJCZPnsyaNWsAOPXUUwHcYhqVPl911VXuVOygot1O3759Xbvl8WmHFzQleM16rFOnTqap8fGa3SWHpB1s5C7e75lhkTz88MMZcn9ZES2gHWRUBLJp0ybAy59HojyL8jSSDotk6NChALRp0wbA3Zlv3LgxwRZnTawCCSGxeU0zyWn0RCX3n3/+ee6MOwDkcbdq1cqds6e/s6JdqpvITiFIKBRyaxDmzp0LeFMbVq9ezapVqxJofXz0eUTPTtSc2EhOOOEEwPucVRzZpUsXN995xRVXJM3WaMwTNAzDMNIW3zxBeXlC3oM8u8mTJ7tTsbds2QJ4uyM10ELs6sQgodj2lVde6XpRyl+uXr3aJ6vio5mG0TH7rNAOV7JwkqISPXv2TOrcttywvzaHfv36udEK5QuD2BohVIL+/fffA94Yq0hUoSdB8AceeACAcePGAdCgQQP3WDU7y1t86qmnkmF2TCQKHS0TBnDKKacAseXOVJmovFSkWIOqmfV3SeV1Q3Zoqn20t5QV+r5FR80UcXnllVcSZWLSUHX59OnT3b+DpNF03Xj88ceBcGWo6iPUQK9r5D///JM0G80TNAzDMNKWlHqCkhTbvHmz29eiqkGNDpKXN3PmTNcD1CgRxb21m7v99tvdCsug0bBhQwDGjx8PhOP3qmzTDjwvEkvmTlPMVXEXTaor8CSGnRuBYVXw9u3b192JB03MPBbykNQ0HwsNotV4LDWIxxtv1aNHDyC1nmCsPlOJNlx++eVA7KpVVVbGqvSV/brepBJViG/evBkIe4LyhvQ+Jk6cCGTsm5anpN459U3H6rnzG4mVKxJRq1YtwBsS0LZtW/carxoPIS9vyJAhbrW2hCnUQ6hcdzJI6SKoRO2vv/7qamRq6rIaxTWVvHjx4u7JqsVDYYHbbrsNgPvvvz9FlmcffVFlm6ZCf/bZZzGnYec1oi+G99xzT64a6pOJkvHSA9UiFmtRVBtOdPvKiy++mKemSChs9OWXXwKeCEAsatasCXgh1FgTQqT0oVBpKtH3RRdS8MJjAwcOBLz5lh06dHC/f2rJiX4/o0aN8mXxE9rM6/tYunRpt3E9XphPBXO//fYb4F1bgogWqQ8++ADwPjvpCL/wwgtZCqNIMSYStVLFa7FIFBYONQzDMNIWJ17Ju+M4SRn+du6557oFMAp1arcjChcu7Opv/v3334BXUDJ69GggeFqH4Hm2CoeqHL1r167ufUHn+uuvB8IFFNG7a3nqCkn36tUr0zFTpkwB/NNjVJm1Igfy5KQB2qxZs0zFBgo5SdouL3h/sZC3MGPGDCD+VImsPCeAl19+GfCKMFKJtExVGAfxbc3qGLV1dOvWLZAhxOyipnddU1RYE7TWKvCK6eR5qxUku2jqjsLFmsd6oIRCoSxj/uYJ/r+9Mw+Uct7/+CttKolStCdKUUnJcitySZHSIkUqS3TrEq2E5CblWtIhhCtUWqRFiZAtskTl2koXJSV1aCER6fz+mN/7+8zMmXM6y8w8z5z5vP6ZeuaZme9znuX7fX9WwzAMI23xJUXi+eefdzO+ErOjy+389NNPzv8k5acQ46BRvHhxLrnkEsBLL5CjWE76VFGBELsjvP796KOPRuwba5/Ro0cneoi5ojBr+WDkH5P6mzNnjkuWl2JKhV6BeUFBSCo6MWDAAAAGDx7sSmvlxooVKwCvG4of5JYsnxuyGCmBXInxsToYpBLyKeYWwBQUZIVp164d4Kn5jh0
7Ol+00D0oP+KsWbNc8FBujRXijSlBwzAMI23xxSdY1GjWrJlLXFUS76233gp4KjaV6N27NwCTJk1y6iE3X4yifhXFq8iuoBUET2fKly/vIrLVhmjv3r3uPQhFXCoa1E9/k3xes2bNci3JcvMJKgpZKVTR1opURxYNWZWC7BPMiaOPPtqdn7p16wJeIZFk+GvNJ2gYhmEYMTAlWAhU7Pbtt992Zd8U2abSVCoOm4r079/flTTKTQmqeK5hxJPatWu7nM8WLVoA2aOUt2zZ4vIK/Wj7lAyKghIEL7dT1iWV5svt2RIvclOCNgkWAJk8Va2mcePG7t99+/YFUjfEPhqZdRXsom4F4VU9/ExENoyijhbYqoijVJ9kBo+kOmYONQzDMIwYmBLMB0oEXbp0KQBVqlQBYOrUqS4sP4gJ/IZhGOmMKUHDMAzDiIEpwTxSokQJ1xFCfQ1V9Fuhv4ZhGEbwMCVoGIZhGDEwJWgYhmEUaUwJGoZhGEYMbBI0DMMw0habBA3DMIy0xSZBwzAMI22xSdAwDMNIW2wSNAzDMNIWmwQNwzCMtMUmQcMwDCNtsUnQMAzDSFtsEjQMwzDSFpsEDcMwjLTFJsEAMWzYMHbv3s3u3bsZMmQIQ4YM8XtIhmEYRRqbBA3DMIy0pYTfA4hm4cKFAHTs2JGBAwcC8Mgjj/g5pLhwwgknUKJE5J/7l19+AeCbb75x28qUKQPAZZddBsCECROSNELDMPzmoINCuuThhx+mefPmAGzbtg2AtWvXAjB37lwA1qxZw44dO3wYZdHClKBhGIaRtgRGCZYqVQqAsmXLArB//36uv/56AJ555hkAfv75Z38GVwAqVaoEQEZGBgBdu3Z1HenVw3H79u0AnHrqqdk+v2HDhiSM0oimdu3aAEydOpXWrVsD3vkqVizUkmzNmjUAtGnThszMzKSPcdq0aQB8++23AMyfPz9fn9+4cSOAL2M3cmf//v0ATJw4kbFjxwLQoUMHAPf86NWrFxBSjaeffjoAX3/9dbKHmhBuv/32iP+PHj36gJ8566yzAHjzzTcL9JumBA3DMIy0JTCd5Q899FAAli5dCuDs4QDVqlUDYOvWrckaToFp1aoV4K1o2rRp496Tkoj+m5900klASPEeddRRABx33HEAgbX561iqVq1K9+7dAbjooosAOOaYYwA47bTTAE95BJmGDRsCuNX3hRdemO18Rf//1Vdf5bzzzkv2UPnwww8BaNasmRvPgcZarFgx9+/vvvsOgB9//BGAPn36AJ7CTQXKlSsHwLx58zj33HMBT0UJPS+mTJnitj3++OOAp6L95vDDDwc8NSML0vHHH891110HwN69ewHo1KkT4FmJXn/9dX799Ve3f6qiZ2VeVF9u6JqPhXWWNwzDMIwYBMYnqFXcX3/95bbdcccdQHDVUDjnn38+AM8++ywABx98cMT7jzzyiFupnHHGGQCcc845AIwYMQKABg0auNV50I65Ro0aQEghAfTs2ROAli1bZtt3z549Ea9BRlG49913H+ApjNWrVzvVMG/ePABOPvlkABYvXgz4p5xatGgBQP/+/YGQ1UD+y7ygY5SSlI9RxxdEFDWte2fOnDlAKIZAz4wffvgBgOLFiwNQpUoVAG666Sb3PbJSXHLJJUkYdWxKlizpVM+AAQMATxHGQse+YMECAHr37g3Arl27qF+/PuAdVyr5BmUly68ClO/vrbfeiss4AjMJ1qpVC4BTTjkl27Y//vjDlzHlBz0YNZl//PHHALRv3x4IBSHopP/+++8APPjgg4B3MRQvXpwlS5Yka8gHpEmTJowcORKALl26AF4Ak0wykyZNcqkf//jHPwB45ZVXAM/cFmQ0+SlIRMcwf/58KleuDISCmsALSNCkOG7cuKSONZpHH320QJ+Tq+GDDz4A4IsvvojbmOKNFl933303ABdffHHE+zt27GDw4MEATJ8+HfBcKzKzDRo0KGJ/v5kyZYpbRCrt4bbbbgPguee
eA6B8+fLOvaBJXPdexYoVgdBCW+ljqTT5iTfeeCPi/2+++Wa2iU0TXkGDXvKCmUMNwzCMtCUwSjDVkQJU8MHKlSsB2L17t9tHqxkpp2jVt3jxYq688spEDzVH5JxXIMGRRx7pzLoyDWq1rePbs2cPTZs2BTwV9emnnyZv0AXklltuAXBqT+ouPN1AwTJHHHEEgDM5SoGlgtKNxUcffQR416zMbEHjkEMOcWPTNaa0Ip2vSZMm8dlnn0V8rm7dugB069bNbdO9dvPNNyd20Llw4oknAiFXgsy5l156acx9t27dyl133QXA5MmTAahevToADzzwABCykEkhBxVZudq0aZMt/aGwqQ3xwpSgYRiGkbYERgnKrp+q3HvvvQAMHToUwCk6hS5fe+21LjDmqaeeivjsli1bAM8v4BdSPPJn/vrrr85HoXJ20WHosfjtt98SNML40blzZyB7uko4y5YtA3B+Ue0rP06qIfWrc3jnnXcCnqoKCocccggQUkBSgFLdSsd5++23s31OPjMF1Ek5AU5V7dy5M0GjPjBKdcjKynLHJf9lboVAFBOhIBpZJAYMGOCeHUFD/r7wFLFoJei3AhSmBA3DMIy0JTBKsGTJktm2KRE0Fbj11luBUJoDeKWOlDC+ZMkSpwSVEKtVnNIOpMD8Qn4KveaV6ChJ+TCCjBLEP//8c8ALvdfY58+fzzXXXAPgkrGVZK4yfqnE2LFjnaJdtWoV4PmWgka9evUALw0HoF+/fkBsBdioUSPA81srhSRoyIJw//33M3z4cAC++uorAEaNGgXAk08+CURGxE+cOBGAdu3aAV6Ucn7v02QQ7gMM51//+lfyB5NHTAkahmEYaUtgyqbJT6ZEUPBWeKlUzqlChQqAF4UmHyF4ZX3UGkUlt/xWgIVl9erVgBf9pjwmP/0veUU+T/kIw8uNRZce03EGVWnEQn7AMWPGuONQab6gRrcqkX3atGkuAV7lCKMLy1911VWu1F358uUBL2m+Tp06QOi8KRIxPFrbL4oXL+4ij5Xnp6ITyrG9/PLLnQ9QLeVkVQpyTmD0fCK/n/7+fpFb2TTfzaHRtfPE9u3bU6LiSDS7du0C4P333wci69mpV5hu0lSf/KJR6L36JKYCSkhWQryKAjRo0MBVVlEdV5nbUgEFmo0ZMwYImXK1wAzq5Ce0IAGviEH080F1NWvUqOEqqmjRLNOiJsF33303EJOf+Ouvv1xah4Jcrr76agDuueceAL7//nu3vypLBXnyywmZRd944w2XCJ+MBPj8YOZQwzAMI23xXQmqR5bKI4nly5cHptJ7flBJKpVEk3kgMzPTqcKaNWsC3so1Otk3lahRo4Yz7SiNIrz+a6qgNIHwdAEdR24ug6Ai9aqxZ2VlufOkV7kZpAyD4nbQvdOtWzcaN24M5KzCP/roI9ddQYF0Oj6pyFQI1NLxKfhFlgnw0qxef/315A8snygAJroeaJs2bbLVCtW+0akTycaUoGEYhpG2+K4EiwrqCfj8888DXvCB0iA6duzoSiQNGTIECCXQg1duLBW58MILXZKyVvBFga5duzrlLqX02GOP+TmkfDFjxgzA6+TRunXrbJ0yYvUeVDK6nwn033zzDQDjx493fsxoS5HSO0aNGuUUoNSTivDL5x4UhRsLxQlIrcon3a9fP9fhRMe6b9++iH2DiFRd9CtkV4f6v98+QlOChmEYRtrie4qEFNPmzZsjti9atCgiSizoqIt1Tonwq1atcsVupQTlC1TkWxDavOSXGTNmuPY2KlycCp3kc0L+pBUrVlC2bFnAS2VR+HqqokIOtWvXjvi/IhQ7d+6ckmkgSo2ITslRknl4mlLQUOK7ejqqlN2oUaPccb333nuAd5x9+/YFghNdmVeifYLRCfW5dYYvLNZZ3jAMwzBiYD7BODBlyhRXfFrKWv4+laiKhVZ20V3oU4mqVavyySefAKmtAIVUUtmyZd25S3UFKFS2S68vv/wyAF9++SXg+aN
SjbZt2wLevaeEeiWiBxE9L6L96OF+Z+XbXnXVVQC89tprgFcgvEuXLoHP+QxHyjW6tJrfijawk2BuVdWDgpz2ffv2dQ7ujIwMwAuQCUcBCZL9ctwHtRJ8OiEz6NNPPw2EHqh+d45PFuEdNVLxmKO7r6iCjOpyBhHVoz3ssMMALxFebpVwPvjgAwBuuOEGwJsoe/Xq5Z43qYAmvTPPPNPfgURh5lDDMAwjbfFdCVapUiXmdnXvDjIKmMjKynJV31999dWY+55wwgkuFUJmQ5lMUxH1QTv11FN55513fB5N4VF4vTrNZ2ZmRnSZL4qoS4Ze16xZk3LHfNRRR7mEet2D6j4fZNQpQ+h5F949IppFixZF7KM6xX5z++23O3WnBPhoE2ebNm1cj8FoVE7NL0wJGoZhGGmL70rw8ssv93sIBUZ+JPCq0//555+AZ/NXyaNwX4vC0FPZF9ixY0cgFNSTyknyUn7qV6fgilT0jeUVHbOKNuuY586d69uYCsoVV1zh/i0fu8r3pRIffvjhAfepVq0aACVK+P7YBrxE+PAk+Oi0h9yQWrSyaYZhGIbhE8FYUqQoX3zxBQCNGzd2UV4vvvhijvvL7q8eb6mMWhCB13E9FVEJqlq1agFedG8qRd3lFx1zs2bNAK/EWnSUZZBRm6S+ffvy22+/AaGC26mCirPLv6dScbGoX78+ACNGjAC8Umt+W5Kiy6DllaAUzhamBA3DMIy0JXBKcOXKlRGvQWbChAkAbNq0iWHDhkW8pwazKr3VqVOnbF2xiwI///yza1mTSigaVDlyKrJclH2BKs2lY1aRbOW7phKKYj322GNdAe3wRrRBR/moag4sC9JTTz3l9lFC/Y033gh4xdCVB+l3k+dYbZO0TdGiivwMiuqLhe+1Q3v06AF4JhlNLMOHD0/0TxuFQJNG6dKlXc3QVKFcuXKsWLEC8AKX9GBRV/JUR53lW7ZsCYSqxIwcORLwTL2a8FOh6sjRRx8WLpj2AAAgAElEQVQNwMyZMwGva0vx4sVdcN306dN9GVthUACdnn8VK1bMto8mfJ2/devWJWl0RQerHWoYhmEYMfDdHDp79uyIVyPYqCK/OhCkQlGDaBo0aOA6r8tc47dpKd4sW7YM8ExpmZmZKd0N4+STT454FQ888EBKKkChcyHTp5F8TAkahmEYaYvvStBILdQvUTz33HM+jaTgrFy5MjAJx4lCgWXq11nUUOBZUfHhGv5hStAwDMNIW3yPDjUMwzCMRGLRoYZhGIYRA5sEDcMwjLTFJkHDMAwjbbFJ0DAMw0hbbBI0DMMw0habBA3DMIy0xSZBwzAMI22xSdAwDMNIW2wSNAzDMNIWmwQNwzCMtMUmQcMwDCNtKdql9I1Cc8ghhwDQqFEjAC666CIAfv75ZyDU4btq1aoATJ48GYCpU6cCsH///qSO1TAMI7+YEjQMwzDSFl+6SDRr1ownn3wSgCZNmgAwd+5cAK666ioAdu3alYifThpnnnkmAM2bNwfg1ltvpUKFCjH3Peig0FqkRo0abN68OTkDzIVjjjkGgDvuuIP27dsDcNhhhwHw+++/A7Bv3z4AypUrx969ewE4+OCDAWjbti0Ar732WvIGHSd0PbZs2ZKHHnoo132LFSvGzp07Afjb3/4GwJo1axI7QMMw8o11kTAMwzCMGCRVCTZt2hSApUuXUrFiRQCnIkqWLAlAp06dAHjxxRfj+dMJ58orrwRg9OjRAE71lS9fHoAD/J0B2LJlC3/99RcATz/9NADTp08HYN26dQkYdWyWLFkChHx6X331FQA//fQTAO+99x4Aa9euBULH+dtvvwGwaNGiiPe6dOmStDEXlHr16gHeWP/xj38AUKdOnVzPWTRffPEF4FkyVqxYEc9hGkbKU7NmTQCuvvrqiO19+/YFoFatWm7boEGDAHjssccAuO6664DQ8/Xzzz8H4IwzzgDgjz/+OOBvmxI0DMMwjBgkVQlqNs/IyGDHjh1AyD8
IMGLECAB69eoFwIknnsi3334bz5/PF9WqVQPgxx9/BHJfbZx99tnOp6loSiGVlxclGGufYcOGATBx4sS8Dr3QaEW2cePGfH1u9erVABx33HEAHHXUUYAXSRokDj30UABeeeUVAFq0aBHxfrFixfKlBMWNN94IwL333lvIER4YXTfXXnvtAffJysriiCOOAEL+aYDBgwe79wAWLlzo6z0XL44//ngAbr75ZgB69uzJ2WefDcBbb73l27jCOemkkwC44IILAE/56BxlZWW58zRu3LgDfp8sTiNHjgRCvu0777wT8Kw3fqA4gR49erjzceyxxxbqO//8808AKlWqBMCvv/56wM+YEjQMwzCMGPiWJ/if//wHwK08t27dCngr9EsvvZTx48f7MzigQ4cOgOfn+uGHH3Lc98svv+Tf//43ALfffjsAe/bsAbzj2blzJxs2bAA8P1/Hjh0BaNOmTbbv/PrrrwFYvHhxIY6iYORXAZ566qmAl0s4b948AH755Zf4DqyQKMJ14sSJbgV++OGH5/nz8l/LiiGlmywqV64MwHnnnQfALbfcAnjRvLGIZWXQv++///6I/48YMYJp06YB3jUqX2dQOfLIIylXrhwA33zzDeD5Y8uWLQvAkCFDDqgAu3fvzpw5cxI4Uo9p06bRo0cPAIoXLx7xXnhu7ZgxYwBYvnw5EFvF6pp+6aWXADjllFPce9rfDyVYvXp1wLO0NGjQwL2n54IyBPRcbNiwYTZ/YTTLly93CjkvCjAvWLJ8Djz++ON53nfTpk089dRTAGzbtg3ABZTI7LFs2TJnDs7NVKYLQg/p//3vf/kad7IpV66cO3YFz/zzn/8EcjcB+0G3bt0A6N27d4E+v379egAmTJgAeE77ZKFrUtdGvKlatapzS/Ts2RPwUpe0uIvXg6ewlCgRenTNnDnTLQJq164N4CZFPYCfeOKJA36fJsxEcPLJJwOea+Oiiy5yixMFkWnRLfdLvXr1aNWqFQDvvPNOjt993333AZGTH4SOPSMjI16HkGdymvzWrVvHAw88AHhBjxJApUuXBnDvx0LBd+PHj2fZsmVxHbOZQw3DMIy0JbBKMDcTTxDZsmUL4K06lSyvRPilS5ceMFn+0UcfdeanoCtAOfCfffZZd64UfKDVbFBQWTeFYueGghC++eYbVyhApeLuuusuAEqVKpWIYR6Qhg0b5vje2LFjgYL97WVe0jkFT1UpeGbp0qUAvPzyy/n+/niiv73MtW3atHHmPrkVZIGQOdFvs7yCpbp27QrAwIEDee655wDPxL579+6Iz6xevdoFmuVEr169uPTSSyO2bd++HQil/OQldSDeKPhFClBurg4dOjhzdTRKdejXr1+O36uASZl944kpQcMwDCNtSaoSlF0XPDv5xRdfDHirbXHuuec65ZQKJdQUrqvV3j333ANEJstH+8g+/vhjAKZMmQKElKCS5YOGgkD69OkDeD6jpk2buhWnEs5PPPFEAGbMmAF4q1O/UHCRxgVeAILG9vDDDwNw9913A6Fr9fnnnwdg1KhRgBespDSY9u3bu/OdDPT3HT58OBDp29Q4FGyVn1SHSZMmASE/VFDLvpUpUwbwgoH0vFi1apU79ksuuSTiM/n5GyTyGtU5Ec8995zzn+cHBUbNmjULgNNOO80pYwXiqWiHyhv6jZ75mZmZ2d6rUaMGgAsSioUCE2WJSASmBA3DMIy0JalKUC12WrZs6fwzZ511FgDvvvsugAtT7t69u1vt5SW6y28efPBBwFO2sXjhhRcAXOqHUhHkTwwi8m0qAlS+onC0Gr3hhhsitivcOVyBJROlbtStWzfbe1r5H3nkkTl+XvtEqwSFpefk400U8hdfc801gBdZ2KtXL5cgLnWoVIfCJr9rBe+Xn1fXlqwlUg0qIzho0CAXaSkf4HfffQdk97PlhhRHIvjvf/8LeNav7t27u7KI4daxaNq1awd4RfgHDhwIeD7ucG677TYgsceRF5599ln
AO0916tQBQs9+PfNVPEPnVLEE4cjPq+s5kVHJpgQNwzCMtCWpSlC+o/79+zvlJKSGpAy7d+/OZZddBngKUuVygkheVIFW1VqpBlkBCq3AFKmmc6FIL/nNwlHEmpKxR40axR133JHwsUajElLyy4YjH2BBOOecc4DYK9hkoPtABRpmzZrloiWlHqQE80P//v2zbVM03sqVKws01sJy7rnnAtn9RkqIr169Op07dwY8X63eU5urcFTEX4XTlYOXyHtRxQ3km37ooYcYOnQo4LUki4XKF6r0WCwUrStV5TdK0Fc0p3ICO3Xq5KxJimSOdV/qnClKOxmRvb70E8wNOcDnzp3rQtT1qgTMICITRfTNqvBfdccA78Tq5g1KPcN4I9NMq1at8lWZJV4sWLAAyB6YMG7cOHeT5WaOygl1ioiVLJ/M2qHxQqbURYsWOXP322+/DXjXqF/BaaqoogpTeUlz+f7774FI06DM+qpbqWR7XRvJqMykFJSMjAyXQqBJOZply5a5CUEdd5Q8LzZs2OBMpbEmfD/Rc1wFKiZMmOCCB3Ni586dzvwZ7y5CVjvUMAzDMGIQuGR5rcyfeOIJ/v73vwNe1+4gK0GZU6K7Pej/t912mzM3Kd3g9ddfB7yQ98ceeyxfzvygo4Cm6BVsopEaC1ff4JW0e//99wukAIWOq2XLllx++eUR7ylII5WQYlYQA4RMduB/epJShtTnUWNVkEn16tWznQOlv5x//vnZvk/mTwUZqbxhMlBwUa9evZwqjBVoBqGgJ7kiWrZsCZCtXFhGRkbgFKDQ/SUz/a5du9y5i0a1ePv27etLH1lTgoZhGEbaEjglKObOnetK8Fx44YUArj9WkANkcmLMmDEuyEBlqlQ0V4n1s2fPLlJKUJQsWdKtfJMRai8/d7S/e+HChUD8/D/79+/P9htBKxqeGyp8oI7fWVlZTiGprFdQUHkxBWLpdfTo0W4fBTvl1l8xJ9q1a5fUknC6D/JyP0Sn+KhowzPPPBP/gcUZFTOPLoYSjtLi/OiYA6YEDcMwjDQmsEoQvFI58plJTaRCakEsPvzwQ8BLIlfYubrYX3fdddx0003+DC4B6Hz9+eefgSuqnc6o/JbaJpUsWRIIRVVGR9IGFRU5GDlypLuvCqIAhdRVkFBkbnScgfy1BSm9lizUmkrpUkp3Ay+SVekhaqXkF6YEDcMwjLQl0EowmiZNmgCpqwSFIt6i/X8qr1RUUC5eMmnSpImLJIw3WrGqOXJ4Gxv5e7VKDzIqQq3WTPJjTp06tdBl1pKF8m9LlSrlGv8WhmRGieYVNeFVmT6Vips5c6ZvY8orKlShuA7wInv1ngq3qxGvX6TUJKgqD373NCss6sCg4xGdOnXy9QJXArH65o0cOTJfQUhKbFY1IJlDk1kt5pNPPmHy5MmAV6M1XmjyU6WWcFQNqTCpF4lGSf46DvWxXL9+PZAagRaqCqOFzubNm10lkqJE+fLlnUlRHSIUQKdUnyCijiuqcyqGDx/uCh6odmhQMHOoYRiGkbaklBL0Gzl3VQFeK+kaNWqwefPmiH1VpklljbKystwKPDpBVnU5E2XGyytKah8yZAgQMpeps7hMMbFQCLfKiKnIwaeffgqQrU5sqjFo0CDA61IQjkrgBd0MWrlyZdd9QuZPJZWrD19QewmGo2NQjeEBAwYEWhkVlM6dO7vuK8uXLweCUx80FjJP63mhWsoqXff444+7e0WBWSo1qTJ3fmFK0DAMw0hbfFGCbdu25ZFHHgE8+7bs+rGKEgeB5s2buzFHr6QvuOAC13NOBWC1MgrvLC+iE6ofeOABwP8SVR999BEAmzZtAkJlp1RcWc5spTpINdasWZPu3bsDnr9GAT4qfB6U9Agl7LZu3doFtaijRzRNmjRx6k4pLAqMUdfu3bt3u4Lpb775ZsLGHQ/OO+88Z5UQKh4gxR5kGjVqBHg9K5UsL6tMUUHnaMKECW5bPAJ/Ekm5cuXctaTnnSxj6iY
R3g9QKR96tviVJC9MCRqGYRhpiy9K8JxzznFqSOG/KnmkcNlp06bF7KDsF+vWrWPWrFkAXHHFFRHv5dUfJJu4bPyKmvzggw/iNcxCoZQNlXN75pln3Ap8xowZMT9TrFgxdy5fe+01wEvC3rp1a0LHmxMKd//hhx8Ar2C5/BQVKlRwqjcnwo9L6PwpxSDofsBwNOZw1O9R0YdBRv5yXVNqVSVVXlSoUaMGEGqfpFJxKgYQVIYOHeoUoK4lRSKHK0A9N7t06QJ4vkBFjfqFL/0E69at68xQxx13HBBynAKceuqpQKiagELuZWpUXpNf5jWZxT7//HMADj30UCDSvKkcQJ1gBc+MGTPGNaJ94403kjPgQtKwYUOXzqHgEF3kq1atAkINXWXO0CShv4Hf5NRPMC/EmgSvv/56wMtvSgVUl7Fr165um8xr6qiwYcMGwDOPBgndc2vXrgU8d4ly6Ioaeg5eeeWV7r6K7oYSNO666y5X1UtjVocc0bVrV0455RTAq1CkmtAvvPBCwsdo/QQNwzAMIwaB6SwvxTR06FAglHSpQAut+sKdxX6i9Acpg06dOpGRkQF4fcrUc87wD3Xvfu+99wBPueeFYsWKMW3aNACXfC+zlGoepgJS5bHuc/U+VL/OoJjlw5HLoH79+oCX6F8U0yLAK/AwYsQI2rVrB3hBV0G97sKVYF7Qs1KfSYblyJSgYRiGYcQgMErQMIz4k5sSVPK10g6CFiBTqVIlZ1mRbzpV/OkFRX6yefPmuW2qwauSZEGjQoUKLm4jGm1/+OGHXY9KxVQoxSwZmBI0DMMwjBhY2TTDSDO0IlcJsqDy+OOPM3bsWKDoK0ChKNipU6e6FImgxxfs2rXLRfKnIqYEDcMwjLTFfIKGYRhGkcZ8goZhGIYRA5sEDcMwjLTFJkHDMAwjbbFJ0DAMw0hbbBI0DMMw0habBA3DMIy0xSZBwzAMI22xSdAwDMNIW2wSNAzDMNIWmwQNwzCMtMUmQcMwDCNtsUnQMPJI6dKlKV26NAsWLGDBggXs37+fDRs2sGHDBr+HZhRhatasSc2aNVm3bh3r1q0jKyuLrKwstm3bRuPGjWncuLHfQ0xpbBI0DMMw0hbfukicdtppAJx00kkADB06FIC6desCcNZZZ/HWW28l6ueNBKDO1//6178A+OyzzwAYP348M2fO9G1c8aJ58+YArFixwm3buHEjAEcffbQvYxItWrQA4MMPPyzU91x00UUA7t7LzMws3MCMAlO7dm0AXn75ZQDq1auXbZ+vv/4agPr16ydvYIWkcuXKQKhfJEDHjh3de6NHjwZwfSTjhXWRMAzDMIwYJFUJVqxYEYBJkyZx9tlnA3DEEUfE3HfHjh1s3rw55nu33norAMuXL2f79u3xHKKBtwqbM2cO//3vfw+4f+nSpQH48ccfAShbtmzE+8uWLeOss86K8yiTR4kSJQCYMWMGAN26dXPvffnllwAcf/zxyR/Y/9OuXTumT58OwLZt2wBo2rQpAH/++Weev6dnz5489dRTADz//PMA9OjRI44j9Q9Zmm6++WYAli5dCgT3+Nq2bcv8+fMBKFOmTMR77733HgCnn34669evB+DYY49N7gALwIABA4DQ9QrQoUOHHPcdNGgQAI888khcftuUoGEYhmHEoEQyf+zvf/87kLfVV8WKFTn88MNjvrdgwQIA5s+fz2WXXQbA77//HqdRJoaGDRty/fXXA9ClSxfAs42vWbMGCPnU5s2bl/Sx6e/cqVMnwFux9evXj5o1awJ5UxTRCrCoID9FuAIE+Oqrr9y59IP27dsDMHXqVGdl0WuxYjkufHNk79697N+/HwgpEYCnn34agL59+xZ6vMnmoINCa/ypU6fSs2dPwFPKd955p2/jyg35ljMyMrIpwI8//hjwzs3MmTNp1KhRcgeYT/72t78B8OSTT3LUUUcBcMghhwC4ay0Wd999NwAHH3wwELIeQv4sG3klKZPg6aefDsBjjz0
W1+/t0qWLm1j+/e9/x/W748Utt9wCwE033eQmCZmg9XrccccBoQeOtskUkgz0+1OmTMn2XkEepkUNLQ6imTt3rjOHJpNy5coBXgBSpUqV3ANl8uTJAOzbty/f3zt//ny+//57wHsYK4CtfPnyAPzyyy+FGHlyadasGQCXXnqp26a/2SeffOLLmA7EtGnTAO+eBNi6dSsA1157LQC//fYbAFdccQUlS5ZM8gjzxplnngnArFmzgJzdXjmhZ6UmQ3H//ffHYXSRmDnUMAzDSFsSqgQPPfRQABYvXgxAhQoV8vzZxYsXOzOdlGQsbrzxRgAeeughAHbv3l2gsRYWmTalTEeOHAl4SmrNmjUu1FkqT+H1CrmvXLmyW70mUwkasZHjXqHq4ueffwY8E02yefDBBwE4+eST3bapU6cCcN1118X1txRwIVOwAmf85pprrgFCz4bhw4cDXmCWzG36m4Bn1p09e3Yyh5lnFLATfk6lAJW2ooAYsWPHjiSNLv/ITZBfBZgTOt+mBA3DMAwjjiRUCVaqVAnImwKUw3PChAlAaGUkJ7/C6+VTPOyww9zn9N1ygvtB5cqVefHFFwHPD6Fglz59+gCwdu1a9uzZA4SCZACuvvpqwPs7zZs3j/Hjxydv4EauSFXJH6bgq86dOwM4/1my0LV1wQUXRGzfvn27s4QUdZT6cd999wGh58Ztt90WsY9SWRo0aACEVOANN9wAwK5du5I11Dwhf7OOIdzH9+yzzwLw7rvvJn9gBUQFM3KzSOTlWR29j9T98ccfzxdffFGIEcb4rbh+m2EYhmGkEAlVgnfccUee973rrrsAuP322902JcLPnTsX8FbeZ5xxBhAK21YUlVbn4X6AZDF27Fi3StcqtHfv3jnu37p1a8DzAygi9JVXXnFqMSgosXXRokU+jyS5TJgwwR27Ii/feecdgKSX81O5NiV4y9cuBg4cyKpVq+LyWzrG6DJwSinwyyco648UnSJkX375Zb777jvAKyvWqlUrwFN9gwcPDpwCFDqX0VGey5cvd5HlqYSe37mlP4j87KP0in79+jFkyJCCDzAGpgQNwzCMtCUhSlDRnIpqioXydSZOnAiQJwWk6Ci9tmrVyilB+QvXrVsHwPvvv1+QoReIBg0aODWXmwIM3x+y5wv6kSgP3so5Flp1vfHGG4AXfavor1q1auWYR5eKyMrQo0cPd15+/fVXwLtWk41KDEYrwNWrVwPwwgsvxO23VChB7XnkgytevHjcfqMgSGHIx/7TTz8BRPgDX3/9dcBTjYrQ3rlzZ7KGmWdq1KgBeMo2mlWrVrnrLhWoXr16ru/v3LmTUqVKAZ5/Tyge5MEHH3TPceUZPvrooxH79unTx0XOv/3224UfOAmaBEeMGAFkl/hKtF21apVLzFa4eWHRbylcOrq6RyJp3bp1tpMVC9XkVBqF0icU2q3XZBMdaBGOJgUtLv766y/AC+ZR3dDcCGp1jnBOOeUUANftQuYX8NJwXnrppeQPDO96EbpOZE5X8nQ80Hf98ccfEdv1AKtTp05S+ycqNF6J4pr87r33XiDUNUOLMD2IVff0nnvuSdo484tM7eqiI3RuZ8+eTb9+/QC4/PLLY35HZmamSyaPTp9IJieeeKKrXRuNFiD9+/d3aWQPP/xwxD5K+dF9Bjl3xahQoULcK1OZOdQwDMNIWxKiBNUTMJply5YBOZehyi+zZ8/m/PPPj9h2zDHHxOW780NWVpYzD9apUwfInuzepUsXzj33XLd/OOPGjUv8IGMglaegntw48sgjC/w78VL7ieTKK68EoGrVqm6b0lykLPxCY9J1I1P/K6+8Epfvl+oNP3YFngilIi1ZssSZ8xNNt27dnNtEIfNSPLKi3HnnndmU8pgxYwAvqKJp06auy4cC1/wmJ3W3du1aINQpR7Vhc0N9JBU8pQT7ZCDlPX369ByviX/+859ApKvn22+/BbzrOT9mzYULF7Jy5coCjTcnTAkahmEYaUt
ClKBWbdHFl+NdjHn69OmccMIJQKhAdSJ+Iy+sWrXKrYRk65fqCy+bpk7KSpJXaPczzzyT1PEKKR2FxWvs8UL+gPDiBkFDyr1Xr15ApEqX3zrZSfEHQspdhRVyS5SXz0ldFGKhe0jBMLlRpUqVPI+zsFx88cXZLBDyX8fyY+tek7KQYm7UqFHgCuxHp6CIWEFqr776KuAlzyuY5oQTTnDqXYWqk9m3U5ak8F6a0UnusVTbkiVL8vwb0d/XuXNn9xzNz/fk+htx+RbDMAzDSEESogRli4/2fSnyLp7oN3L6zWTQokULpwSj/RNSWfPnz3cl4TRG+Uj9igrNzMwEvLBz9XkMj9LKCa3GMjMz3So7OvRZ0cDXXHNN3PxX8UIrTHWwjo44mzJliivN5TfyL6sgRLVq1QAvCluviUSpMeecc07Cf0sMGzbMlSMUUu6yplSrVo0ffvgB8Cwb0c+AJUuWuOL1qcIvv/ziUj4UIavnxEcffQR4KTKQ3M7ySn1TVGespHdt69+/P1DwazTWd8f7GW9K0DAMw0hbktpZXn6JeEVoHXTQQa7zsN8oqkvJxtFMmzbN+Z2kAKXA/EYrTPmW8luMWSo4umiuutKvX7++sEOMO0pWjlbuUsf/+c9/kj6mnFDOq+4bKfZ4IR9aeDk4RctKdSohP17l2fLCd99951ogCfktwwtiS53Gu7CynyxfvpyuXbvGfC9WeyL5BBOJEtjVuDlWYwT5zxVxrPxO8Aqk5FQE4LDDDnO5hLI0ifCE+nglyYukToKS7Mcccwxff/11ob+vRo0a2R5iQSW8qoxfKRGJQmabePeySyQ5BYGoPucHH3yQzOHkib59+wKeiUkLLpkIw1m+fDngBV81b948x4ldifF79+512xTYpUlQwRl+oaAXTQwyvY8YMaJITH7R11ssoaDUs/D6rfv27QO8nqSJRC6DWJPfwoULAa+LhBL9RYkSJahVqxbgma2j6d27t3MZyV0hc2ishPp4YeZQwzAMI21JiBL87LPPgFBocjiq8/nCCy/QsWNHAL766qsC/06sclz67aCgUmnNmjUjIyMDiF+Sc1Bo2bKl30PIF+3bt+eJJ56I2Pbmm28CwVazMgnpVSWz8oJM8AdCClmr9qAgBTh69GjAS5pPRKdxP1AqiEyOmzZtcu+pNOHAgQOByDqdUsFz5sxJyjhjsXDhQmedkGslP50edDwKAApHpleZUhOBKUHDMAwjbUmIEtRqWisYlRQT9evXdzbk2bNnA14/wXC/RE6onFDbtm3dNpXlCcpKXt3jVck+KysrWym1okJeimgHicmTJzsHvJBDf8eOHX4MKTB8+umnAGzcuBGILCTuB2XKlAG8YgYqwDB06FDfxhQP1J9T6kd+XRUGHzdunAv6UyGQCy+8MOI71q9fz8UXX5yM4UYQncC+cePGfKV5RQe/yCoYTjK7lpgSNAzDMNKWhChBdYS/7LLLAE+lhZflkn9Q/cBU+FptNmL1A2zTpg3ghQOXL1/evXfVVVcB/hdrlkpdvHgx4EW19enTJ+6hvUGgSpUqvqxGC4LKg0UXhwZvJa5VqlIl0pVt27YBXmK2omaTzbBhwwCvUIBa9iSzX2gimDRpEuA1E5Di7t69e8RrLBQxOWzYMP73v/8lcpi5/r4444wzXK/NnPojhiMF2KFDh4jve+utt3yxlpkSNAzDMNKWhOYJ/v7774DnE1Tui1Z14cjmr8Tgffv2ZVtxyPek102bNrloNpXo8ptwHyB4CfFF1R9YqlSpQrVZSia7du0CvNyqcJTAmxefdDqgaL8SJUKPiPBoxWSi54HOixRHqvP5558DXlyDyrqpIHasRgB6pshvuGDBgoSPMxzdI7qPlC/YpEkTmjRpAuBa2+VW2kz54mr7pLzBnj17+lJCslhugy1WrFhci7RVrFgRCAWNyDhjDwAAAAHMSURBVMkrU5qqi4T9do5/SE14N9xwQ0TiqJ/ccsstgJcSoeobMuEWVap
Vq+a6ziuIQciUNnr06LhVfI8H3377rasYI2SWkhnbCAZ6KE6bNg2AwYMH+zmchKNAmdNPP91N/Cp4sGXLFsDrbuIXWpjInNm7d2/3XnSSeyyi6/bmt0JVQcjKysqxvZCZQw3DMIy0JalKMBb16tUD4Oyzzwa8+qJnnnlmttWETAYqofPSSy8lenh5omvXrq7G4Z49ewA477zzgOTWWvQLldiKPh8qnfTkk08mfUxG6lOmTBk2b94MeKlUOdXmNZKPnt3hZetyU4IqdKBnoj6ndJxEYkrQMAzDMGKQ1ALasVCIr15VJieVaNeunSsuq/6B6aAAhcrAJTPB1Sj6lCtXznUgCU+HMoKBntklS5b0eSSFw5SgYRiGkbb4rgSLAllZWS7MNzxSyjCMgvPjjz/SvHlzv4dhFHFMCRqGYRhpi+/RoYZhGIaRSCw61DAMwzBiYJOgYRiGkbbYJGgYhmGkLTYJGoZhGGlLroExhmEYhlGUMSVoGIZhpC02CRqGYRhpi02ChmEYRtpik6BhGIaRttgkaBiGYaQtNgkahmEYacv/AXTS+shH5PNnAAAAAElFTkSuQmCC\n", "text/plain": [ "