{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "## Optimal Nonparametric Density Estimation with `condensier` and `sl3`\n", "\n", "## Lab 09 for PH 290: Targeted Learning in Biomedical Big Data\n", "\n", "### Author: [Nima Hejazi](https://nimahejazi.org)\n", "\n", "### Date: 14 March 2018" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# I. Nonparametric Density Estimation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For a moment, we will go back to simple data structures: we have observations which are realizations of univariate random variables,\n", "$$X_1, \\ldots, X_n \\sim F,$$\n", "where $F$ denotes an unknown cumulative distribution function (CDF). The goal is to estimate the distribution $F$. In particular, we are interested in estimating the density $f = F′$ , assuming that it exists.\n", "\n", "### Histograms\n", "\n", "The histogram is the oldest and most popular density estimator. We need to specify an \"origin\" $x_0$ and the class width $h$ for the specifications of the intervals:\n", "$$I_j =(x_0 + j \\cdot h,x_0 + (j + 1) \\cdot h), (j = \\ldots, −1, 0, 1, \\ldots)$$\n", "\n", "### The Naïve Kernel Estimator\n", "\n", "$$\\hat{f}(x) = \\frac{1}{nh} \\sum_{i = 1}^{n} w \\left(\\frac{x - X_i}{h}\\right),$$\n", "where $w(x) = \\frac{1}{2}, \\mid X \\mid \\leq 1; 0, \\text{otherwise}$. 
This is merely a simple weight function that places a rectangular box of width $2h$ around each observation $X_i$.\n", "\n", "By replacing $w$ with a general smooth kernel function, we get the definition of the _kernel density estimator_:\n", "$$\\hat{f}(x) = \\frac{1}{nh} \\sum_{i = 1}^{n} K \\left(\\frac{x - X_i}{h}\\right),$$\n", "where $$K(x) \\geq 0, \\quad \\int_{-\\infty}^{\\infty} K(x) \\, dx = 1, \\quad K(x) = K(-x).$$\n", "\n", "The nonnegativity of the kernel function $K(\\cdot)$ guarantees a nonnegative density estimate $\\hat{f}(\\cdot)$, and the normalization $\\int K(x) \\, dx = 1$ implies that $\\int \\hat{f}(x) \\, dx = 1$, which is necessary for $\\hat{f}(\\cdot)$ to be a density.\n", "Typically, the kernel function $K(\\cdot)$ is chosen as a probability density that is symmetric around $0$. Additionally, the smoothness of $\\hat{f}(\\cdot)$ is inherited from the smoothness of the kernel.\n", "\n", "In the above definition, we leave the bandwidth $h$ as a _tuning parameter_, which can be chosen to minimize a suitable risk criterion, so that the estimated function is optimal given the available data. For a large bandwidth $h$, the estimate $\\hat{f}(x)$ tends to vary slowly as a function of $x$, while small bandwidths produce a more variable function estimate.\n", "\n", "### The Bandwidth $h$\n", "\n", "The bandwidth $h$ is often also called the \"smoothing parameter\". It should be clear that for $h \\to 0$ we will have spikes at every observation $X_i$, whereas $\\hat{f}(\\cdot) = \\hat{f}_h(\\cdot)$ becomes smoother as $h$ increases. In the above, we use a global bandwidth, which we might choose optimally via cross-validation, but we can also use variable bandwidths (locally varying bandwidths $h(x)$), the general idea here being to use a large bandwidth in regions where the data are sparse. With respect to the _bias-variance tradeoff_: **the (absolute value of the) bias of $\\hat{f}$ increases and the variance of $\\hat{f}$ decreases as $h$ increases**." 
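, "\n", "This tradeoff can be made precise. Under standard smoothness assumptions (a twice continuously differentiable density $f$ and a symmetric kernel $K$), a Taylor expansion yields the classical pointwise asymptotic approximations\n", "$$\\text{Bias}(\\hat{f}(x)) \\approx \\frac{h^2}{2} f''(x) \\int u^2 K(u) \\, du, \\quad \\text{Var}(\\hat{f}(x)) \\approx \\frac{f(x)}{nh} \\int K^2(u) \\, du,$$\n", "so the squared bias grows on the order of $h^4$ while the variance shrinks on the order of $(nh)^{-1}$; balancing the two yields the familiar optimal rate $h \\asymp n^{-1/5}$."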
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# II. Density Estimation with `condensier`" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "First, let's load the packages we'll be using and some core project management tools." ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Changing active project to lab_09\n", "✔ Writing a sentinel file '.here'\n", "● Build robust paths within your project via `here::here()`\n", "● Learn more at https://krlmlr.github.io/here/\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "here() starts at /Users/nimahejazi/Dropbox/UC_Berkeley-grad/teaching/2018_Spring/tlbbd-labs/lab_09\n", "── Attaching packages ─────────────────────────────────────── tidyverse 1.2.1 ──\n", "✔ ggplot2 2.2.1 ✔ purrr 0.2.4\n", "✔ tibble 1.4.2 ✔ dplyr 0.7.4\n", "✔ tidyr 0.8.0 ✔ stringr 1.3.0\n", "✔ readr 1.1.1 ✔ forcats 0.3.0\n", "── Conflicts ────────────────────────────────────────── tidyverse_conflicts() ──\n", "✖ dplyr::filter() masks stats::filter()\n", "✖ dplyr::lag() masks stats::lag()\n", "\n", "Attaching package: ‘data.table’\n", "\n", "The following objects are masked from ‘package:dplyr’:\n", "\n", " between, first, last\n", "\n", "The following object is masked from ‘package:purrr’:\n", "\n", " transpose\n", "\n" ] } ], "source": [ "library(usethis)\n", "usethis::create_project(\".\")\n", "library(here)\n", "library(tidyverse)\n", "library(data.table, quietly = TRUE)" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "condensier\n", "The condensier package is still in beta testing. 
Interpret results with caution.\n" ] } ], "source": [ "library(simcausal)\n", "library(condensier)\n", "library(sl3)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We begin by simulating a simple data set and illustrating a basic invocation of `condensier` for conditional density estimation:" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "...automatically assigning order attribute to some nodes...\n", "node W1, order:1\n", "node W2, order:2\n", "node W3, order:3\n", "node A, order:4\n" ] } ], "source": [ "D <- DAG.empty()\n", "D <- D + node(\"W1\", distr = \"rbern\", prob = 0.5) +\n", " node(\"W2\", distr = \"rbern\", prob = 0.3) +\n", " node(\"W3\", distr = \"rbern\", prob = 0.3) +\n", " node(\"A\", distr = \"rnorm\", mean = (0.98 * W1 + 0.58 * W2 + 0.33 * W3), sd = 1)\n", "D <- set.DAG(D, n.test = 10)" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "using the following vertex attributes: \n", "120.8NAdarkbluenone0\n", "using the following edge attributes: \n", "0.50.40.8black1\n" ] }, { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAA0gAAANICAYAAAD958/bAAAEGWlDQ1BrQ0dDb2xvclNwYWNl\nR2VuZXJpY1JHQgAAOI2NVV1oHFUUPrtzZyMkzlNsNIV0qD8NJQ2TVjShtLp/3d02bpZJNtoi\n6GT27s6Yyc44M7v9oU9FUHwx6psUxL+3gCAo9Q/bPrQvlQol2tQgKD60+INQ6Ium65k7M5lp\nurHeZe58853vnnvuuWfvBei5qliWkRQBFpquLRcy4nOHj4g9K5CEh6AXBqFXUR0rXalMAjZP\nC3e1W99Dwntf2dXd/p+tt0YdFSBxH2Kz5qgLiI8B8KdVy3YBevqRHz/qWh72Yui3MUDEL3q4\n4WPXw3M+fo1pZuQs4tOIBVVTaoiXEI/MxfhGDPsxsNZfoE1q66ro5aJim3XdoLFw72H+n23B\naIXzbcOnz5mfPoTvYVz7KzUl5+FRxEuqkp9G/Ajia219thzg25abkRE/BpDc3pqvphHvRFys\n2weqvp+krbWKIX7nhDbzLOItiM8358pTwdirqpPFnMF2xLc1WvLyOwTAibpbmvHHcvttU57y\n5+XqNZrLe3lE/Pq8eUj2fXKfOe3pfOjzhJYtB/yll5SDFcSDiH+hRkH25+L+sdxKEAMZahrl\nSX8ukqMOWy/jXW2m6M9LDBc31B9LFuv6gVKg/0Szi3KAr1kGq1GMjU/aLbnq6/lRxc4XfJ98\nhTargX++DbMJBSiYMIe9Ck1YAxFkKEAG3xbYaKmDDgYyFK0UGYpfoWYXG+fAPPI6tJnNwb7C\nlP7IyF+D+bjOtCpkhz6CFrIa/I6sFtNl8auFXGMTP34sNwI/JhkgEtmDz14ySfaRcTIBInmK\nPE32kxyyE2Tv+thKbEVePDfW/byMM1Kmm0XdObS7oGD/MypMXFPXrCwOtoYjyyn7BV29/MZf\nsVzpLDdRtuIZnbpXzvlf+ev8MvYr/Gqk4H/kV/G3csdazLuyTMPsbFhzd1UabQbjFvDRmcWJ\nxR3zcfHkVw9GfpbJmeev9F08WW8uDkaslwX6avlWGU6NRKz0g/SHtCy9J30o/ca9zX3Kfc19\nzn3BXQKRO8ud477hLnAfc1/G9mrzGlrfexZ5GLdn6ZZrrEohI2wVHhZywjbhUWEy8icMCGNC\nUdiBlq3r+xafL549HQ5jH+an+1y+LlYBifuxAvRN/lVVVOlwlCkdVm9NOL5BE4wkQ2SMlDZU\n97hX86EilU/lUmkQUztTE6mx1EEPh7OmdqBtAvv8HdWpbrJS6tJj3n0CWdM6busNzRV3S9KT\nYhqvNiqWmuroiKgYhshMjmhTh9ptWhsF7970j/SbMrsPE1suR5z7DMC+P/Hs+y7ijrQAlhyA\ngccjbhjPygfeBTjzhNqy28EdkUh8C+DU9+z2v/oyeH791OncxHOs5y2AtTc7nb/f73TWPkD/\nqwBnjX8BoJ98VQNcC+8AAEAASURBVHgB7d0HnCRXfSfwGgkkLAl8Bkw0RkKAgSOZbOMDESRy\ntsAYTFyJwz5hgqTVWQSJYJBEkI3BwJGOnI3PAQOHTwSRozHYJJEMCBAyWAgjQDv3fzvbuz27\n3TPd011V77361ufzdrqrK7z3/bfU85t+Xd00FgIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nA
QIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQI
ECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIEC
BAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAA\nAQIECBAg0InASU9omj88pJNTOck6gf3X3XOHAAECBAgQIECAAIGeBY4/uGlW/rppDvz3pjn7\noz13ZnCn329wIzZgAgQIECBAgAABAlkL7P+Q6N4FEZIeEz/9vt5xrYB3DO50BAgQIECAAAEC\n2QlcIq8erdy5aXY8Kvp0zaY58e559a3+3ghI9dfYCAkQIECAAAECBKYLHBAP/Ue086KdFe3P\nox0XLQWT60e7bLQOl5NuHyd7X9Oc8Y74+eV4F+mxHZ7cqUIgs7SsJgQIECBAgAABAgQ6FfhF\nnO2XdrXbxs/fjpbWXTLa6Hfln8XtFKJ+uKul++mz/OnNhr1/rsa6P472j9G2smxrmov+R+yY\njvPCCEjPbZrtN2qa0z6zlYPZZ36BUdHn39MeBAgQIECAAAECBPIXSCEmBaBL7WqTbqfAk95J\nSksKRqmNL+mxy+9q4+vHb++IO/G5oZ3vRI2vn+P2E64euSjOdcmDYmpdtF+8O/LXz+MA6V2k\nh89xIJsuILCywL52JUCAAAECBAgQINCGQAookwJNWjcp4Gy0Pr3D85/RfrqrTbr94njskGh7\nLymcpHdyUkBKIeor0T4Z7QvRvr1XOy/up20XWLafFjvHFexWf7LnIPvdJu7fOLof4enPvrtn\nvVttCXgHqS1ZxyVAgAABAgQIDEsghYh5w8u07VPQmBRkUsgZXx+Xwd4w+KTtU8jZbHlmbJAC\nUgpBo2lzX4rb74mWLrOdWgpFF0draTk23jFqfjOm0h21/gTH3zJm+n04aB8d609Z/5h7bQh4\nB6kNVcckQIAAAQIECJQhMOldmmmhZbP1KYjsHWDGw8w8t1sMIhMLc1asvVa0+O6h5t3RzoqW\nwleHy/b43NJq/G5++pn7nnT7P8dnka7QNBdes2menz4LZWlRQEBqEdehCRAgQIAAAQJLFki/\nu20WVDaabja+74FxrIuibTT1bJ7As+D0siVLzXe45Npm/68Sx08B8vuTu7U9fe9RunreWfEm\n1eOa5tlf3bPdCfGO0n7pYg2HRxcjwF306KY58zt7Hndr2QIC0rJFHY8AAQIECBAgsF4gfQZm\nPJgscjt9Nmeed2L2Djh731/fU/eWLXCrOODTot0+2tuj3S+aJXMBASnzAukeAQIECBAg0IvA\nAXHWSdPPZn13ZjwEpd+3Fgk14/umz8hY8hZIn/G/b7RTol03Wnpn6p3RtkX7VjRL5gICUuYF\n0j0CBAg
QIEBgZoFRoNkoxGz02CjUpJ+jz9Ps/Y7L3vfHw8u02+k7dSz1C/yXGGIKQdujpUuC\npyU+O9Q8ItrH0p0pS3pn6Vej3XrK41Z3LJASroUAAQIECBAg0IdA+kPtKJTMElxGAWjSPrN+\nniYFnPQh9/RzWthJQafNz6PE4S2VCKQpj3eJ9tBo94iW7qflu9HSl72+Jd3ZYInPGzX3iva8\nDbbxUMcC3kHqGNzpCBAgQIBA4QLp8zSzhplJQWZ83/TL5LSQspX1hdPqfiEC6Xl9ZLTfjXbv\naOny3OnS4KOplE+O2+mCC5tNh7xhbJO+U+mH0a4WLQVzSwYC3kHKoAi6QIAAAQIEWhZIQWSj\nd1/GQ8tmt9MvgRu9+zIKNhfEdumKXdOmnaX1m/0CGZtYCPQukJ7z1492h2h3ixZf3LozEMWP\nJv0und5tTO3voh0T7dxomy0HxwbviJb+4PDIaMJRIOSyCEi5VEI/CBAgQIDAeoE0ZWxSqNks\nwOy9T/prd/pOmVGoGQWY0f3xAJO+92XS+vFt0mdzLASGIvCMGGiaKneZaOm/jfTf5fgMrPTf\nw0+iPSraG6PNurw8NkyX/n5vtPTdS5aMBASkjIqhKwQIECBQtED6pWkUTjYKMRs9Nj4lLb27\nMimspHXjgSW9SzNpu9E26eeOaBYCBOYXOCR2Se/2pCX9tzu+pItvvD/ag6N9Z/yBTW6nzyvd\nP1raP717ZMlMYDwBZ9Y13SFAgACBPQInxnz3lfus3V9NL8Qvi/vXiJ/pUrLxor0Sc9gvjg8D\nn5Hms8dyYrwAr9wy2vlxMa4zm+Y55zXNCTeJWSEPigevG+vf3TRfiTnyb+762+p39i6jf9Ln\nBsZDySy3pwWcdFnoUVDZO8SMwsro8fH7026nKTsWAgT6FbhynP7r0dI01dEy+v/mSbHiOdHm\n+W/1+rH9J6Kl/188LVr6vJIlMwEBKbOC6A4BAgSmC2w/NYJNvJiuRvg57VVr26UgtN8rY91T\nY91TxvaNX/y3fzH+QHnPeP3+fASm+DDxyrNiu1fHzyPW2upfxj5/OLZPKTfT7IdZgsy0bcYD\nzv5xrGkBZdr6aeHnolIA9ZMAgZkEfjO2ShdhuHO0G0dLoSa9sxt/kNq5/kPxc57lirHxp6Nd\nKdp3ox0WLf1/xpKZgICUWUF0hwABAtMFTrx0BJv07lF8EPi0B6xtd3T8gn/41+P2t5vmWbfY\ns+8fXa5pDnlRbHf02rrt8VfOc07c847R9nfFsW7bNOfH9JGXdPGZkvSLxbTAMu/61TjWtPAy\ny/rxgNPF2PeUxS0CBEoQGAWj1Ne3Rzs32teipT+ofCzavaOlgDPPkv4/d3a0G0ZL71w/Mtor\nolkyFPAZpAyLoksECBCYLHB6XBVs+xvjsZgmlwLQC36wFni2x3S5lYfFYzeIQPTZtX0PjgC1\n8oa128fH/PkdL9wTjtLa1TjOyh3iRpo2MktISH/pHH/nZd7b6RzjwWTa7ZgKuGn4GU1viU0t\nBAgQWJrA3sHoU2NHjnfpd15U4bHxc5b/Z47tujNYvSVWpOl1KRx9I9qrolkyFRCQMi2MbhEg\nQGCKQPzFceUR8ZnhB8Xj8Rmi9A5Sc50IPOldlVjfPC5aLCsxte7C9OWDsTz7wvjnKztv7v5n\nJb63Y/Wj8e7RT3av2vjGH8TDKdRMCjY/mrJ+fPvUPwsBAgRyFNgoGI36+4zRjTl/ptla6Yp1\nR0ZLf5BKF2Z4UjR/6AmEXBcBKdfK6BcBAgQmCpz2gaY56Zy1kJQC0qF3is3eGvd/HIEnrqR0\n9PaY1h7v9qx+s2mev9FnYtIXHJ4w8RSTVz518mprCRAgUKzALMFo0cE9Pw7wwGgpHKXle9Fe\nv/OWf7IVEJCyLY2OESBAYKrAqyMQPSWC0k1ji4fE9LnHR4tAtH9MqTss3jlaiRf9lddO3bvZ\nHvusvrlpT
o+wZSFAgMDgBLoIRgn1edEeFW30+3a6wMMzo6V3kSwZC6S3/SwECBAgUJTACYdH\nGPpydDl9Hik+X/SsezTNcQc2zUFxoYaVj0f4ielzp90mHpswre3Em8U2sf26K94VNXqdJUCA\nwBYFugpG6ffreIe/+e/RRuEodTne6d95Bbs07dmSscB+GfdN1wgQIEBgosAZ6fNEH4kWF2LY\n8aa1TdJ0upXXxO2jon0w2oRwdHxMvVu56/pwdNKvrO3vXwIECFQrkILRqdHS1efSVeniHfhm\n/AIMcXdpSwpHL482/s5ROnia8vyiaMJR0sh8GU+1mXdV9wgQIEBgj8Dq6+N2XBFpx9v2rLv4\nFfHO0mPWps/tWbt264QrxWOnxWNnxnci3WrXo1eI/W8Ut5+299buEyBAoAKBrt4xGqdKf3S6\n7/iKXbfTVx2kgGQpQEBAKqBIukiAAIF9BS6Kd44uFS/+O69Qt+vhMz4dn0v6+/hsUUyzG192\nfn9SfO9RE5cBXzk62viDtxu/4zYBAgQqEOgjGI3Yzo8b140WnxVtjoiWZmvtiPa+aOndfwsB\nAgQIECDQnsBxl9n32JPW7buVNQQIEKhQIAWjNJUutXS7z+X4OPlzo10QLU15/r1oFgIECBAg\nQIAAAQIECLQukFMwSoO9bbQnR0tv11812inR4kI6FgIECBAgQIAAAQIECLQnkFswSiO9bLR0\nBburpDuWMgV8BqnMuuk1AQIECBAgQGCoAikYpSvSpSVdla6tK9LtPMGc/zwytv+HaPG1C5ZS\nBQSkUiun3wQIECBAgACBYQnkHIxSJY6Idqlo74hmKVhAQCq4eLpOgAABAgQIEBiAQO7BKJXg\nctHS5b2fGS1dlMFSsICAVHDxdJ0AAQIECBAgULFACcFoxJ+m1qV3jr4zWuFnuQICUrm103MC\nBAgQIECAQI0CJQWj5H9EtHSVuvTZI0sFAgJSBUU0BAIECBAgQIBABQKlBaNEnqbW3S/an0Yz\ntS4QalgEpBqqaAwECBAgQIAAgXIFSgxGI+00te7vo5laNxKp4KeAVEERDYEAAQIECBAgUKBA\nycEocd8umql1BT7xNuuygLSZkMcJECBAgAABAgSWKVB6MEoWo6vWmVq3zGdGJscSkDIphG4Q\nIECAAAECBCoXqCEYjUpkat1IosKfAlKFRTUkAgQIECBAgEBGAjUFo8Rqal1GT642uiIgtaHq\nmAQIECBAgAABArUFo1TRy0e7T7RnRnPVukCocRGQaqyqMREgQIAAAQIE+hOoMRiNNB8RN1y1\nbqRR6U8BqdLCGhYBAgQIECBAoGOBmoNRorx9tAOivTPdsdQrICDVW1sjI0CAAAECBAh0IVB7\nMEqGo6l1z4jbptZ18azq8RwCUo/4Tk2AAAECBAgQKFhgCMFoVJ501bq/jXbuaIWf9QoISPXW\n1sgIECBAgAABAm0IDCkYJb80tS79zvyudMdSv4CAVH+NjZAAAQIECBAgsAyBoQWjZDaaWvf0\nuG1q3TKeRQUcQ0AqoEi6SIAAAQIECBDoUWCIwWjEvS1upKl13x2t8LN+AQGp/hobIQECBAgQ\nIEBgKwJDDkbJK02t2z+aqXVJY0CLgDSgYhsqAQIECBAgQGAGgaEHo0Rkat0MT5RaNxGQaq2s\ncREgQIAAAQIE5hMQjPZ4mVq3x2JwtwSkwZXcgAkQIECAAAEC6wQEo3UczR3irql1600GdU9A\nGlS5DZYAAQIECBAgsFtAMNpNsfvGr8ate0XzhbC7SYZ3Q0AaXs2NmAABAgQIEBi2gGA0vf6j\nL4R11brpRtU/IiBVX2IDJECAAAECBAjsFBCMNn4i3DEe3i/auzfezKO1CwhItVfY+AgQIECA\nAIGhCwhGmz8DRlPrnh6b+kLYzb2q3kJAqrq8BkeAAAECBAgMWEAwmr3422LTv4lmat3sZtVu\nKSBVW1oDI0C
AAAECBAYqIBjNV/g77trc1Lr53KrdWkCqtrQGRoAAAQIECAxMQDCav+CjqXVP\ni11NrZvfr8o9BKQqy2pQBAgQIECAwIAEBKOtF3tb7Pp/on1v64ewZ20CAlJtFTUeAgQIECBA\nYCgCgtFilT5y1+6m1i3mWN3eAlJ1JTUgAgQIECBAoHIBwWjxAl8hDnGPaE9f/FCOUJuAgFRb\nRY2HAAECBAgQqFVAMFpeZbfFoUytW55nVUcSkKoqp8EQIECAAAECFQoIRsst6lFxuHRBhv+7\n3MM6Wi0CAlItlTQOAgQIECBAoDYBwWj5FR1NrXvq8g/tiLUICEi1VNI4CBAgQIAAgVoEBKP2\nKnlMHPqvo32/vVM4cukCAlLpFdR/AgQIECBAoBYBwajdSqapdTuimVrXrnPxRxeQii+hARAg\nQIAAAQKFCwhG7RfQ1Lr2jas5g4BUTSkNhAABAgQIEChMQDDqpmArcZpt0Uyt68a7+LMISMWX\n0AAIECBAgACBwgQEo24Llr4Q1tS6bs2LPpuAVHT5dJ4AAQIECBAoSEAw6r5YV4xTpi+EddW6\n7u2LPaOAVGzpdJwAAQIECBAoREAw6qdQo6l1b4/Tu2pdPzUo8qwCUpFl02kCBAgQIECgAAHB\nqN8ipavWXRztPf12w9lLExCQSquY/hIgQIAAAQK5CwhG/VcoTa27W7Sn9d8VPShNQEAqrWL6\nS4AAAQIECOQqIBjlURlT6/KoQ7G9EJCKLZ2OEyBAgAABApkICEaZFGJXN9LUul9E+8e8uqU3\npQgISKVUSj8JECBAgACB3AQEo9wq0jRpat3do52aX9f0qBQBAamUSuknAQIECBAgkIuAYJRL\nJdb3I02tOybaX0U7b/1D7hGYXUBAmt3KlgQIECBAgMCwBQSjvOt/p+jez6KZWpd3nbLvnYCU\nfYl0kAABAgQIEOhZQDDquQAznP5KsU26ap2pdTNg2WRjAQFpYx+PEiBAgAABAsMVEIzKqP1o\nat3borum1pVRs6x7KSBlXR6dI0CAAAECBHoQEIx6QF/glGlq3UXR/t8Cx7Argd0CAtJuCjcI\nECBAgACBgQsIRuU9AUytK69m2fdYQMq+RDpIgAABAgQItCwgGLUM3NLhTa1rCXbohxWQhv4M\nMH4CBAgQIDBcAcGo7NrfObpval3ZNcyy9wJSlmXRKQIECBAgQKBFAcGoRdyODn3lOM9doj21\no/M5zYAEBKQBFdtQCRAgQIDAwAUEozqeAGlq3bZovhC2jnpmNwoBKbuS6BABAgQIECCwZAHB\naMmgPR8uTa37aTRXreu5ELWeXkCqtbLGRYAAAQIECAhG9T0H0tS6u0Y7pb6hGVEuAgJSLpXQ\nDwIECBAgQGBZAoLRsiTzOs5oat1bo1s/yKtrelOTgIBUUzWNhQABAgQIDFtAMKq7/umiDGlq\n3Vl1D9Po+hYQkPqugPMTIECAAAECiwoIRosK5r9/mlqXPnt0av5d1cPSBQSk0iuo/wQIECBA\nYLgCgtEwaj/6QlhT64ZR795HKSD1XgIdIECAAAECBOYUEIzmBCt883RRhp9Ee2/h49D9QgQE\npEIKpZsECBAgQIBAIxgN70lwlRjynaKdMryhG3FfAgJSX/LOS4AAAQIECMwqIBjNKlXXduNX\nrTu/rqEZTc4CAlLO1dE3AgQIECAwbAHBaNj1N7Vu2PXvbfQCUm/0TkyAAAECBAhMERCMpsAM\naHWaWpeuWveUAY3ZUDMREJAyKYRuECBAgAABAj5j5DmwU2A0te4tcc/UOk+KzgUEpM7JnZAA\nAQIECBDYS8A7RnuBDPyuqXUDfwL0PXwBqe8KOD8BAgQIEBiugGA03NpPG7mpddNkrO9MQEDq\njNqJCBAgQIAAgV0CgpGnwiSB0RfCvjkeNLVukpB1nQgISJ0wOwkBAgQIECAQAoKRp8FGAneL\nB38c7X0bbeQxA
m0LCEhtCzs+AQIECBAgIBh5DmwmkKbWHRXtlM029DiBtgUEpLaFHZ8AAQIE\nCAxXQDAabu3nGfl+sfEx0Uytm0fNtq0JCEit0TowAQIECBAYrIBgNNjSb2ng6ap1aWrd+7e0\nt50ILFlAQFoyqMMRIECAAIEBCwhGAy7+Fod+1dgvTa3zhbBbBLTb8gUEpOWbOiIBAgQIEBia\ngGA0tIovZ7zjU+v+fTmHdBQCiwsISIsbOgIBAgQIEBiqgGA01MovZ9zpqnX/Ec3UuuV4OsqS\nBASkJUE6DAECBAgQGJCAYDSgYrc01DS17shopta1BOywWxcQkLZuZ08CBAgQIDA0AcFoaBVv\nZ7yjqXVvisObWteOsaMuICAgLYBnVwIECBAgMBABwWgghe5omHeP8/wo2gc6Op/TEJhLQECa\ni8vGBAgQIEBgUAKC0aDK3clgR1PrntTJ2ZyEwBYEBKQtoNmFAAECBAhULiAYVV7gnoaXptYd\nG+2N0X7YUx+clsCmAgLSpkQ2IECAAAECgxEQjAZT6l4GmqbWpWBkal0v/E46q4CANKuU7QgQ\nIECAQL0CglG9tc1lZL8WHbljtCfn0iH9IDBNQECaJmM9AQIECBCoX0Awqr/GOYxw/Kp1ptbl\nUBF92FBAQNqQx4MECBAgQKBKAcGoyrJmOyhT67ItjY5NEhCQJqlYR4AAAQIE6hQQjOqsa86j\nMrUu5+ro20QBAWkii5UECBAgQKAqAcGoqnIWMxhXrSumVDo6LiAgjWu4TYAAAQIE6hIQjOqq\nZ2mjuUd0+PxoZ5fWcf0dtoCANOz6Gz0BAgQI1CkgGNVZ15JGdbXo7B2iuWpdSVXT150CApIn\nAgECBAgQqEdAMKqnliWPZP/o/DHR3hDNVetKruRA+y4gDbTwhk2AAAECVQkIRlWVs/jBpKl1\nP4j2weJHYgCDFBCQBll2gyZAgACBSgQEo0oKWdEw0tS620d7UkVjMpSBCQhIAyu44RIgQIBA\nFQKCURVlrG4Q41PrflTd6AxoMAIC0mBKbaAECBAgUIGAYFRBESsegql1FRd3SEMTkIZUbWMl\nQIAAgVIFBKNSKzecfv96DNXUuuHUu+qRCkhVl9fgCBAgQKBwAcGo8AIOpPujqXWvj/GaWjeQ\notc8TAGp5uoaGwECBAiUKiAYlVq5YfY7Ta07L9qHhjl8o65NQECqraLGQ4AAAQIlCwhGJVdv\nmH03tW6Yda961AJS1eU1OAIECBAoREAwKqRQurlOwNS6dRzu1CIgINVSSeMgQIAAgRIFBKMS\nq6bPI4F7xo3vRzO1biTiZxUCAlIVZTQIAgQIEChMQDAqrGC6u49Amlp3u2i+EHYfGitKFxCQ\nSq+g/hMgQIBASQKCUUnV0tdpAqOpda+LDX40bSPrCZQqICCVWjn9JkCAAIGSBASjkqqlr5sJ\njKbWfXizDT1OoEQBAanEqukzAQIECJQiIBiVUin9nFXg6rHhEdGePOsOtiNQmoCAVFrF9JcA\nAQIEShAQjEqokj7OK5Cm1m2L5gth55WzfVECAlJR5dJZAgQIEMhcQDDKvEC6t5DAvWLv70Uz\ntW4hRjvnLiAg5V4h/SNAgACBEgQEoxKqpI+LCIym1j1xkYPYl0AJAgJSCVXSRwIECBDIVUAw\nyrUy+rVMgdFV614bB/2PZR7YsQjkKCAg5VgVfSJAgACB3AUEo9wrpH/LFLh3HOzcaB9Z5kEd\ni0CuAgJSrpXRLwIECBDIUUAwyrEq+tSmQJpad5tovhC2TWXHzkpAQMqqHDpDgAABApkKCEaZ\nFka3WhVIU+uOjZa+ENbUulapHTwnAQEpp2roCwECBAjkJiAY5VYR/elSIE2t+040U+u6VHeu\n3gUEpN5LoAMECBAgkKGAYJRhUXSpU4E0te620U7u9KxORiADAQEpgyLoAgECBAhkIyAYZVMK\nHelRYDS1Ll217oI
e++HUBHoREJB6YXdSAgQIEMhMQDDKrCC606uAqXW98jt53wICUt8VcH4C\nBAgQ6FNAMOpT37lzFDg0OpWuWucLYXOsjj51IiAgdcLsJAQIECCQmYBglFlBdCcLgfR74THR\nTK3Lohw60ZeAgNSXvPMSIECAQB8CglEf6s5ZikCaWvftaB8tpcP6SaANAQGpDVXHJECAAIHc\nBASj3CqiP7kJHBYd+m/RTK3LrTL607mAgNQ5uRMSIECAQIcCglGH2E5VrED6fXBbtNdEc9W6\nYsuo48sSEJCWJek4BAgQIJCTgGCUUzX0JXeB+0QHvxXtY7l3VP8IdCEgIHWh7BwECBAg0JWA\nYNSVtPPUIpCm1v1ONFPraqmocSwsICAtTOgABAgQIJCBgGCUQRF0oTiB9Htgumrda6KZWldc\n+XS4LQEBqS1ZxyVAgACBLgQEoy6UnaNWgTS17t+imVpXa4WNa0sCAtKW2OxEgAABAj0LCEY9\nF8Dpixcwta74EhpAWwICUluyjkuAAAECbQgIRm2oOubQBNLvf8dGe3U0U+uGVn3j3VRAQNqU\nyAYECBAgkIGAYJRBEXShGoH7xki+Ee3j1YzIQAgsUUBAWiKmQxEgQIDA0gUEo6WTOuDABdLU\nultHO3ngDoZPYKqAgDSVxgMECBAg0KOAYNQjvlNXKzCaWveqGOGPqx2lgRFYUEBAWhDQ7gQI\nECCwVAHBaKmcDkZgncBoat0n1q11hwCBdQIC0joOdwgQIECgJwHBqCd4px2MwDVipKbWDabc\nBrqIgIC0iJ59CRAgQGBRAcFoUUH7E9hc4JKxSfpCWFPrNreyBYFGQPIkIECAAIE+BASjPtSd\nc6gC94mBp6vWmVo31GeAcc8lICDNxWVjAgQIEFhQQDBaENDuBOYUODy2/+1oT5xzP5sTGKyA\ngDTY0hs4AQIEOhUQjDrldjICOwXS1Lpt0dIXwrpq3U4S/xDYXEBA2tzIFgQIECCwdQHBaOt2\n9iSwqEC6at3Xo5lat6ik/QclICANqtwGS4AAgc4EBKPOqJ2IwESBNLXut6L5QtiJPFYSmC4g\nIE238QgBAgQIzC8gGM1vZg8CyxYYv2rdhcs+uOMRqF1AQKq9wsZHgACBbgQEo26cnYXALAL3\ni42+Gu2Ts2xsGwIE1gsISOs93CNAgACB+QQEo/m8bE2gbYFrxgluGc1V69qWdvxqBQSkaktr\nYAQIEGhVQDBqldfBCWxJYPyqdabWbYnQTgQaXxTrSUCAAAECcwkIRnNx2ZhApwKm1nXK7WS1\nCngHqdbKGhcBAgSWKyAYLdfT0QgsWyBNrbtVNFetW7as4w1OQEAaXMkNmAABAnMJCEZzcdmY\nQC8Co6l1/zvObmpdLyVw0poEBKSaqmksBAgQWJ6AYLQ8S0ci0LbAaGrdp9o+keMTGIKAgDSE\nKhsjAQIEZhcQjGa3siWBHARMrcuhCvpQlYCAVFU5DYYAAQJbFhCMtkxnRwK9CZha1xu9E9cs\nICDVXF1jI0CAwOYCgtHmRrYgkKvA70bHzolmal2uFdKvIgUEpCLLptMECBBYWEAwWpjQAQj0\nKpCm1t0imi+E7bUMTl6jgIBUY1WNiQABAtMFBKPpNh4hUIpAmlp3TDRXrSulYvpZlICAVFS5\ndJYAAQJbFhCMtkxnRwLZCRwdPfpytE9n1zMdIlCBgIBUQRENgQABAhsICEYb4HiIQIEC14o+\n3zyaL4QtsHi6XIaAgFRGnfSSAAEC8woIRvOK2Z5A/gIHRBe3RUtT636Sf3f1kECZAgJSmXXT\nawIECEwTEIymyVhPoHyBdNU6U+vKr6MRZC4gIGVeIN0jQIDAjAKC0YxQNiNQqMC1o9+m1hVa\nPN0uS0BAKqteekuAAIG9BQSjvUXcJ1CfwGhq3StjaKbW1VdfI8pMQEDKrCC6Q4AAgRkFBKMZ\noWxGoAKBdNW6L0b7T
AVjMQQC2QsISNmXSAcJECCwTkAwWsfhDoHqBdLUuptFc9W66kttgLkI\nCEi5VEI/CBAgsLGAYLSxj0cJ1Cgwmlr3ihicqXU1VtiYshQQkLIsi04RIEBgt4BgtJvCDQKD\nExhNrfunwY3cgAn0KCAg9Yjv1AQIENhAQDDaAMdDBAYgYGrdAIpsiHkKCEh51kWvCBAYroBg\nNNzaGzmBkUCaWndMNFPrRiJ+EuhQQEDqENupCBAgsIGAYLQBjocIDEwgTa37QjRT6wZWeMPN\nQ0BAyqMOekGAwHAFBKPh1t7ICUwS+I1YedNoT5z0oHUECLQvICC1b+wMBAgQmCQgGE1SsY7A\nsAXS1LpHRntlNFetCwQLgT4EBKQ+1J2TAIEhCwhGQ66+sRPYWOD+8bCpdRsbeZRA6wICUuvE\nTkCAAIGdAoKRJwIBAhsJXCceTP+feNJGG3mMAIH2BQSk9o2dgQCBYQsIRsOuv9ETmEXA1LpZ\nlGxDoCMBAakjaKchQGBwAoLR4EpuwAS2LPCA2PNfon12y0ewIwECSxMQkJZG6UAECBDYKSAY\neSIQIDCPwGhq3cnz7GRbAgTaExCQ2rN1ZAIEhiUgGA2r3kZLYBkCo6l16Qth/3MZB3QMAgQW\nFxCQFjd0BAIEhi0gGA27/kZPYBEBU+sW0bMvgZYEBKSWYB2WAIHqBQSj6ktsgARaFUhT624c\nzRfCtsrs4ATmFxCQ5jezBwECwxYQjIZdf6MnsAyBA+Mgoy+ENbVuGaKOQWCJAgLSEjEdigCB\nqgUEo6rLa3AEOhVIU+s+H81V6zpldzICswkISLM52YoAgeEKCEbDrb2RE2hDYDS17k/aOLhj\nEiCwuICAtLihIxAgUKeAYFRnXY2KQJ8CaWrdtmgvj/bTPjvi3AQITBcQkKbbeIQAgWEKCEbD\nrLtRE+hCIE2t+1y0f+7iZM5BgMDWBASkrbnZiwCB+gQEo/pqakQEchK4bnTmRtF8IWxOVdEX\nAhMEBKQJKFYRIDAoAcFoUOU2WAK9CIyuWpe+ENbUul5K4KQEZhcQkGa3siUBAnUJCEZ11dNo\nCOQs8HvRuTStztS6nKukbwR2CQhIngoECAxNQDAaWsWNl0C/Amlq3Q2jmVrXbx2cncDMAgLS\nzFQ2JECgcAHBqPAC6j6BAgUuFX1OXwjrqnUFFk+XhysgIA239kZOYCgCgtFQKm2cBPITSFet\nS9Pq0pXrLAQIFCIgIBVSKN0kQGBuAcFobjI7ECCwRIHrxbFMrVsiqEMR6EpAQOpK2nkIEOhK\nQDDqStp5CBCYJpCm1j0imql104SsJ5CxgICUcXF0jQCBuQQEo7m4bEyAQIsC6ap1n41mal2L\nyA5NoC0BAaktWcclQKArAcGoK2nnIUBgFoH/GhtdP9oTZ9nYNgQI5CcgIOVXEz0iQGA2AcFo\nNidbESDQncBoat3L4pS+ELY7d2cisFQBAWmpnA5GgEAHAoJRB8hOQYDAlgTS1Lp/ivb5Le1t\nJwIEshAQkLIog04QIDCDgGA0A5JNCBDoTcDUut7onZjAcgUEpOV6OhoBAssXEIyWb+qIBAgs\nV2A0te6lcVhT65Zr62gEOhcQkDond0ICBGYUEIxmhLIZAQK9C4ym1v1L7z3RAQIEFhYQkBYm\ndAACBJYsIBgtGdThCBBoVWA0te7kVs/i4AQIdCYgIHVG7UQECGwiIBhtAuRhAgSyExifWndR\ndr3TIQIEtiQgIG2JzU4ECCxRQDBaIqZDESDQqcAD42yfiWZqXafsTkagXQEBqV1fRydAYLqA\nYDTdxiMECOQvkL4MNk2vM7Uu/1rpIYG5BASkubhsTIDAEgQEoyUgOgQBAr0KpKl1D4+Wrlpn\nal2vpXByAssXEJCWb+qIBAhMFhCMJrtYS4BAeQK/H13+dDRT68qrnR4T2FRAQNqUyAYECCwo\nIBgtCGh3AgSyErhB9Oa
60Z6YVa90hgCBpQkISEujdCACBPYSEIz2AnGXAIHiBX4pRvCwaP8r\nmql1gWAhUKOAgFRjVY2JQL8CglG//s5OgEB7AumqdWlq3b+2dwpHJkCgbwEBqe8KOD+BegQE\no3pqaSQECOwrMJpa56p1+9pYQ6AqAQGpqnIaDIFeBASjXtidlACBDgXS1Lp01bqXRPtZh+d1\nKgIEehAQkHpAd0oClQgIRpUU0jAIENhUIF217pPRTK3blMoGBMoXEJDKr6EREOhaQDDqWtz5\nCBDoUyBNrbtONFPr+qyCcxPoUEBA6hDbqQgULiAYFV5A3SdAYG6Bg2KPh0VLV60ztS4QLASG\nICAgDaHKxkhgMQHBaDE/exMgUK5Aumrdp6KZWlduDfWcwNwCAtLcZHYgMBgBwWgwpTZQAgQm\nCJhaNwHFKgJDEBCQhlBlYyQwn4BgNJ+XrQkQqE8gTa1LV617cTRT6+qrrxER2FBAQNqQx4ME\nBiUgGA2q3AZLgMAGAumqdZ+I9oUNtvEQAQKVCghIlRbWsAjMISAYzYFlUwIEqhe4YYzw2tGe\nWP1IDZAAgYkCAtJEFisJDEJAMBpEmQ2SAIE5BEZXrTO1bg40mxKoTUBAqq2ixkNgcwHBaHMj\nWxAgMEwBU+uGWXejJrBOQEBax+EOgaoFBKOqy2twBAgsKGBq3YKAdidQi4CAVEsljYPAdAHB\naLqNRwgQIJAERlet+8u47ap1nhMEBi4gIA38CWD4VQsIRlWX1+AIEFiiwIPiWB+L9sUlHtOh\nCBAoVEBAKrRwuk1gAwHBaAMcDxEgQGAvgRvF/WtFc9W6vWDcJTBUAQFpqJU37hoFBKMaq2pM\nBAi0KTC6ap2pdW0qOzaBwgQEpMIKprsEJggIRhNQrCJAgMAMAqbWzYBkEwJDExCQhlZx461J\nQDCqqZrGQoBA1wI3jhNeM9qTuj6x8xEgkLeAgJR3ffSOwCQBwWiSinUECBCYXSBNrXtotBdG\nc9W62d1sSWAQAgLSIMpskJUICEaVFNIwCBDoXeDB0YOPRvtS7z3RAQIEshMQkLIriQ4R2EdA\nMNqHxAoCBAhsWWA0te7kLR/BjgQIVC0gIFVdXoMrXEAwKryAuk+AQHYCo6vW/UX07OfZ9U6H\nCBDIQkBAyqIMOkFgnYBgtI7DHQIECCxNIE2t+0i0Ly/tiA5EgEB1AgJSdSU1oIIFBKOCi6fr\nBAhkL5Cm1h0ezRfCZl8qHSTQr4CA1K+/sxNIAoKR5wEBAgTaFTg4Dp+uWveCaKbWtWvt6ASK\nFxCQii+hARQsIBgVXDxdJ0CgKAFT64oql84S6FdAQOrX39mHKSAYDbPuRk2AQD8C6f+5h0Xz\nhbD9+DsrgeIEBKTiSqbDBQsIRgUXT9cJEChSYDS1zlXriiyfThPoR0BA6sfdWYclIBgNq95G\nS4BAPgJpat2Ho7lqXT410RMC2QsISNmXSAcLFhCMCi6erhMgULzATWIEptYVX0YDINC9gIDU\nvbkz1i8gGNVfYyMkQCBvgTS17g+iuWpd3nXSOwJZCghIWZZFpwoVEIwKLZxuEyBQnUAKR6bW\nVVdWAyLQjYCA1I2zs9QtIBjVXV+jI0CgLIE0te7QaC8rq9t6S4BALgICUi6V0I8SBQSjEqum\nzwQI1CyQptY9JNrzo/lC2JorbWwEWhQQkFrEdehqBQSjaktrYAQIFC6QwtEHo32l8HHoPgEC\nPQoISD3iO3VxAoJRcSXTYQIEBiRw0xjrr0d76YDGbKgECLQgICC1gOqQ1QkIRtWV1IAIEKhM\nYDS17s9jXKbWVVZcwyHQtYCA1LW485UkIBiVVC19JUBgyAJpat3Z0UytG/KzwNgJLElAQFoS\npMNUJSAYVVVOgyFAoHIBU+sqL7DhEehaQEDqWtz5chYQjHKujr4RIEBgX4FDYlX6ziNXrdvX\nxhoCBLYoICBtEc5uVQkIR
lWV02AIEBiQQApHrlo3oIIbKoEuBASkLpSdI1cBwSjXyugXAQIE\nNhcwtW5zI1sQILAFAQFpC2h2KV5AMCq+hAZAgMDABdLUunRhhj+L5qp1A38yGD6BZQsISMsW\ndbycBQSjnKujbwQIEJhdIIWjD0Q7Z/ZdbEmAAIHZBASk2ZxsVbaAYFR2/fSeAAEC4wI3iztX\ni/aS8ZVuEyBAYFkCAtKyJB0nRwHBKMeq6BMBAgS2LnDp2DVdmOHMaL/Y+mHsSYAAgekCAtJ0\nG4+UKyAYlVs7PSdAgMBGAikcpal1X91oI48RIEBgEQEBaRE9++YmIBjlVhH9IUCAwPIEbh6H\numo0U+uWZ+pIBAhMEBCQJqBYVZyAYFRcyXSYAAECcwmkqXUPjmZq3VxsNiZAYCsCAtJW1OyT\ni4BglEsl9IMAAQLtCqSr1r0/mql17To7OgECISAgeRqUKCAYlVg1fSZAgMDWBG4Ru10l2ou3\ntru9CBAgMJ+AgDSfl637FRCM+vV3dgIECHQtkKbWPSiaqXVdyzsfgQELCEgDLn5BQxeMCiqW\nrhIgQGCJAg+NY70vmql1S0R1KAIENhYQkDb28Wi/AoJRv/7OToAAgT4F0tS6K0d7UZ+dcG4C\nBIYnICANr+YljFgwKqFK+kiAAIH2BEZT654Xp/CFsO05OzIBAhMEBKQJKFb1JiAY9UbvxAQI\nEMhKYDS17mtZ9UpnCBAYhMDKIEZpkLkLCEa5V0j/CBAg0J3AleJU14x2drTV7k7rTAQIEFgT\nEJA8E/oUEIz61HduAgQIECBAgACBfQQEpH1IrOhAQDDqANkpCBAgQIAAAQIECBDIX+CG0cVT\no6WQZCFAgAABAgQIECBAgAABAgQIECBAgAABAgRyFDDFLseq6BMBAgQIEChO4IS4uML+/zOu\nq3DJ6PrPmubiNzbNft9smpWHxf2rxM90wYWzmuZZb46fsTzh5k1ziQfGjbii7urrmub0DzfN\nsbHvL983jvP7sf7cptnxslj/0bS1hQABAl0JCEhdSWd/nhOPjBev+6x1c/U78TNelFauET/j\nhaq5VNz+YbzYvaVpzvjk2jYnxiVYV24Z7fym+fmZTfOc89bWH3eZpjnosU3z4xc0zQt+sLbO\nvwQIECAwDIETbhyh6OPx2vCOCEL3WBvziSkcfS5ux2vLaddb77A9Xj9Wvxsh6Jlr6086Nu7H\n68hKHGP1j2Pd7Zrmgl9rmhf+eP1+7hEgQKA9gf3aO7QjlyVw+rujv9+PF6VHR/t6vFh9O17I\nPhAvUJ9ZW7f68z3hKI3s9FfHP0dFOIq/+o3C0Qm3bZqDXx/bn9o0B0SoshAgQIDAsATO+HSM\n9+3Rjmia4w9eG3t6PWnSa8Z1Yt1ha+t2/3to3PqLtXuPu2z8IS7eLTrt2RGuzorbj4r1BzXN\npW+y9rh/CRAg0I2AgNSNcyFnWY0XpdULo7N329Phr74mbn8rQs9d9qxLt/7oV+KfT0U4+vye\n9We8N6ZDxF8DLQQIECAwXIEdL42xHxLvJN1/j8GOf4jXkZWYTffwPevSlLyVeGfo9AvW1j0v\nZiTsDFijTdL3IX2jac45e7TCTwIECHQhICB1oVzMOXa+SMWc8eZeEYAut9btN18coSm9uxRz\nxbffYM9QDn5AvLC9Yc/90a39YnsLAQIECAxX4Ix3xdjTH9Yescdgv7vGa8nX4v7Dou363WP/\nB8W7RBNeR2KL5qRD43NIz419Htw06XXIQoAAge4EBKTurEs50yviRe3AmCoXL1xpOXr/+Cem\nRaymD9eOvdit3LNpLvzbtIWFAAECBAiMCeyI14zXxmvJ7zTN46/dNMfFa8rK4fH4n0a7Wvyx\n7ci1bVcjNP3onWP77bp5QkypW31Z7PNbkaXiD3THx08LAQIEuhMQkLqzLuTKdom2AAASEElE\nQVRM6XNHzTnxwrQrDB16p7j
/1rj/nvgZf8k7+oCmOfE34sXrm03z/IsKGZRuEiBAgEC3Aq9a\nO90B8VpyUMxK2PF38boR7xaladwrj4yQdKP4+a9N85Kf79utdDGg0+4Q69Oshe/FO0kpWFkI\nECDQmYCA1Bl1USd6dbxwxYvXSTeNF6aHxAva62IaRMwpX7l80xwW7xytxLr94q+DFgIECBAg\nMEngtM/Fa0cEnSZeL1biqqdpKt3OadzpEt8RmJrHN80v0mdcN1ieFX+sa14S7eobbOQhAgQI\nLF1AQFo6aQ0HvDgC0s7lhPg3rkKUrkD007gq0er5EYyOiXW3iSsMxQUZLAQIECBAYKpA+kPa\nleO1Iz5DNLraafoKiSZmIqTpd8/+0NQ9dz+wI95Baj6x+64bBAgQ6EBAQOoAubxTnPGV6PNH\nosWFGHa8aa3/aTrdSvpr31HRPhgtfSbJQoAAAQIEpgisxutH+vzqyuiPbrHdzq+P+FKsf8u+\nOx17UNOccOtYH1e7S0v6DOzK3eN16E/W7vuXAAEC3QhcopvTOEt5Aquvjz5fP16Y3ran7xe/\nIqbcPSZe2NIUiSnL6oFrr20rwvcUIasJECAwDIHT/y0+axQX8zn3b9aPd+V18doy4SI/l7li\nvMb8VUzvTpf9TldU/eX4rr2T492nCFQWAgQIECDQu8BjY1rESa/ctxsnxQdtpy3pi2LT4yfF\nXwzTt6M/4XrTtrSeAAECBIYgcNxl9h3liZfed934mvSFsRYCBAgQIJClwKQXtknrsuy8ThEg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQ
IAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIA
AAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAA
QIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAg\nQIAAAQIECBAgQIAAAQIECBAgQIAAAQIECBAgkL3A/wcrgm8t2Wyk7wAAAABJRU5ErkJggg==", "text/plain": [ "Plot with title “”" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "plotDAG(D, xjitter = 0.3, yjitter = 0.04, edge_attrs = list(width = 0.5, arrow.width = 0.4, arrow.size = 0.8), vertex_attrs = list(size = 12, label.cex = 0.8))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now that we've taken a look at the structure of the data-generating process (the DAG), let's generate some data and take a quick look at the data set:" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "simulating observed dataset from the DAG object\n" ] }, { "data": { "text/html": [ "\n", "\n", "\n", "\t\n", "\t\n", "\t\n", "\t\n", "\t\n", "\t\n", "\n", "
<thead><tr><th scope=col>ID</th><th scope=col>W1</th><th scope=col>W2</th><th scope=col>W3</th><th scope=col>A</th></tr></thead>
<tbody>
	<tr><td>1</td><td>0</td><td>0</td><td>0</td><td> 0.264097302</td></tr>
	<tr><td>2</td><td>1</td><td>0</td><td>1</td><td> 1.947282783</td></tr>
	<tr><td>3</td><td>1</td><td>0</td><td>0</td><td> 0.343522286</td></tr>
	<tr><td>4</td><td>0</td><td>0</td><td>0</td><td>-1.342266575</td></tr>
	<tr><td>5</td><td>0</td><td>0</td><td>0</td><td> 1.446429830</td></tr>
	<tr><td>6</td><td>0</td><td>0</td><td>0</td><td>-0.001315845</td></tr>
</tbody>
\n" ], "text/latex": [ "\\begin{tabular}{r|lllll}\n", " ID & W1 & W2 & W3 & A\\\\\n", "\\hline\n", "\t 1 & 0 & 0 & 0 & 0.264097302\\\\\n", "\t 2 & 1 & 0 & 1 & 1.947282783\\\\\n", "\t 3 & 1 & 0 & 0 & 0.343522286\\\\\n", "\t 4 & 0 & 0 & 0 & -1.342266575\\\\\n", "\t 5 & 0 & 0 & 0 & 1.446429830\\\\\n", "\t 6 & 0 & 0 & 0 & -0.001315845\\\\\n", "\\end{tabular}\n" ], "text/markdown": [ "\n", "ID | W1 | W2 | W3 | A | \n", "|---|---|---|---|---|---|\n", "| 1 | 0 | 0 | 0 | 0.264097302 | \n", "| 2 | 1 | 0 | 1 | 1.947282783 | \n", "| 3 | 1 | 0 | 0 | 0.343522286 | \n", "| 4 | 0 | 0 | 0 | -1.342266575 | \n", "| 5 | 0 | 0 | 0 | 1.446429830 | \n", "| 6 | 0 | 0 | 0 | -0.001315845 | \n", "\n", "\n" ], "text/plain": [ " ID W1 W2 W3 A \n", "1 1 0 0 0 0.264097302\n", "2 2 1 0 1 1.947282783\n", "3 3 1 0 0 0.343522286\n", "4 4 0 0 0 -1.342266575\n", "5 5 0 0 0 1.446429830\n", "6 6 0 0 0 -0.001315845" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "data_O <- as.data.table(sim(D, n = 10000, rndseed = 57192))\n", "head(data_O)" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ "newdata <- data_O[sample(seq_len(1000)), c(\"W1\", \"W2\", \"W3\", \"A\"), with = FALSE]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now that we've generated some data, we will create a `Task` object for use with `sl3`:" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": true }, "outputs": [ { "data": { "text/plain": [ "A sl3 Task with 10000 obs and these nodes:\n", "$covariates\n", "[1] \"W1\" \"W2\" \"W3\"\n", "\n", "$outcome\n", "[1] \"A\"\n", "\n", "$id\n", "NULL\n", "\n", "$weights\n", "NULL\n", "\n", "$offset\n", "NULL\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "task <- sl3_Task$new(data_O, covariates = c(\"W1\", \"W2\", \"W3\"),\n", " outcome = \"A\")\n", "task" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We'll do this same conversion for the sub-sampled 
testing data as well:" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [], "source": [ "new_task <- sl3_Task$new(newdata, covariates = c(\"W1\", \"W2\", \"W3\"),\n", " outcome = \"A\")" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
<ol class=list-inline>
	<li>'binomial'</li>
	<li>'categorical'</li>
	<li>'continuous'</li>
	<li>'density'</li>
	<li>'ids'</li>
	<li>'multivariate_outcome'</li>
	<li>'offset'</li>
	<li>'preprocessing'</li>
	<li>'timeseries'</li>
	<li>'weights'</li>
	<li>'wrapper'</li>
</ol>
\n" ], "text/latex": [ "\\begin{enumerate*}\n", "\\item 'binomial'\n", "\\item 'categorical'\n", "\\item 'continuous'\n", "\\item 'density'\n", "\\item 'ids'\n", "\\item 'multivariate\\_outcome'\n", "\\item 'offset'\n", "\\item 'preprocessing'\n", "\\item 'timeseries'\n", "\\item 'weights'\n", "\\item 'wrapper'\n", "\\end{enumerate*}\n" ], "text/markdown": [ "1. 'binomial'\n", "2. 'categorical'\n", "3. 'continuous'\n", "4. 'density'\n", "5. 'ids'\n", "6. 'multivariate_outcome'\n", "7. 'offset'\n", "8. 'preprocessing'\n", "9. 'timeseries'\n", "10. 'weights'\n", "11. 'wrapper'\n", "\n", "\n" ], "text/plain": [ " [1] \"binomial\" \"categorical\" \"continuous\" \n", " [4] \"density\" \"ids\" \"multivariate_outcome\"\n", " [7] \"offset\" \"preprocessing\" \"timeseries\" \n", "[10] \"weights\" \"wrapper\" " ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "sl3::sl3_list_properties()" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [], "source": [ "lrn <- Lrnr_condensier$new(nbins = 10, bin_method = \"equal.len\", \n", " bin_estimator = Lrnr_xgboost$new(nrounds = 50, objective = \"reg:logistic\"))" ] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "\n", "\n", "\t\n", "\t\n", "\t\n", "\t\n", "\t\n", "\t\n", "\n", "
<thead><tr><th scope=col>likelihood</th></tr></thead>
<tbody>
	<tr><td>0.1054223</td></tr>
	<tr><td>0.3310383</td></tr>
	<tr><td>0.3310383</td></tr>
	<tr><td>0.1044063</td></tr>
	<tr><td>0.1429661</td></tr>
	<tr><td>0.1044063</td></tr>
</tbody>
\n" ], "text/latex": [ "\\begin{tabular}{r|l}\n", " likelihood\\\\\n", "\\hline\n", "\t 0.1054223\\\\\n", "\t 0.3310383\\\\\n", "\t 0.3310383\\\\\n", "\t 0.1044063\\\\\n", "\t 0.1429661\\\\\n", "\t 0.1044063\\\\\n", "\\end{tabular}\n" ], "text/markdown": [ "\n", "likelihood | \n", "|---|---|---|---|---|---|\n", "| 0.1054223 | \n", "| 0.3310383 | \n", "| 0.3310383 | \n", "| 0.1044063 | \n", "| 0.1429661 | \n", "| 0.1044063 | \n", "\n", "\n" ], "text/plain": [ " likelihood\n", "1 0.1054223 \n", "2 0.3310383 \n", "3 0.3310383 \n", "4 0.1044063 \n", "5 0.1429661 \n", "6 0.1044063 " ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "trained_lrn = lrn$train(task)\n", "pred_probs = trained_lrn$predict(new_task)\n", "head(pred_probs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Next, ..." ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [], "source": [ "lrn1 <- Lrnr_condensier$new(nbins = 25, bin_method = \"equal.len\", \n", " bin_estimator = Lrnr_glm_fast$new())\n", "lrn2 <- Lrnr_condensier$new(nbins = 35, bin_method = \"equal.len\",\n", " bin_estimator = Lrnr_xgboost$new(nrounds = 50, objective = \"reg:logistic\"))" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "Iter: 1 fn: 17575.2858\t Pars: 0.38205 0.61795\n", "Iter: 2 fn: 17575.2858\t Pars: 0.38205 0.61795\n", "solnp--> Completed in 2 iterations\n", "\n", "density meta-learner fit:\n", "Lrnr_condensier_equal.len_25_20_FALSE_NA_FALSE_NULL \n", " 0.3820534 \n", "Lrnr_condensier_equal.len_35_20_FALSE_NA_FALSE_NULL \n", " 0.6179466 \n" ] }, { "data": { "text/html": [ "\n", "\n", "\n", "\t\n", "\t\n", "\t\n", "\t\n", "\t\n", "\t\n", "\n", "
<thead><tr><th scope=col>likelihood</th></tr></thead>
<tbody>
	<tr><td>0.15692777</td></tr>
	<tr><td>0.32778128</td></tr>
	<tr><td>0.29269600</td></tr>
	<tr><td>0.13861217</td></tr>
	<tr><td>0.21158060</td></tr>
	<tr><td>0.04837965</td></tr>
</tbody>
\n" ], "text/latex": [ "\\begin{tabular}{r|l}\n", " likelihood\\\\\n", "\\hline\n", "\t 0.15692777\\\\\n", "\t 0.32778128\\\\\n", "\t 0.29269600\\\\\n", "\t 0.13861217\\\\\n", "\t 0.21158060\\\\\n", "\t 0.04837965\\\\\n", "\\end{tabular}\n" ], "text/markdown": [ "\n", "likelihood | \n", "|---|---|---|---|---|---|\n", "| 0.15692777 | \n", "| 0.32778128 | \n", "| 0.29269600 | \n", "| 0.13861217 | \n", "| 0.21158060 | \n", "| 0.04837965 | \n", "\n", "\n" ], "text/plain": [ " likelihood\n", "1 0.15692777\n", "2 0.32778128\n", "3 0.29269600\n", "4 0.13861217\n", "5 0.21158060\n", "6 0.04837965" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "sl <- Lrnr_sl$new(learners = list(lrn1, lrn2),\n", " metalearner = Lrnr_solnp_density$new())\n", "sl_fit <- sl$train(task)\n", "head(sl_fit$predict(new_task))" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "
<table width=100%><tr><td>Lrnr_solnp_density {sl3}</td><td align=right>R Documentation</td></tr></table>

<h2>Nonlinear Optimization via Augmented Lagrange</h2>

<h3>Description</h3>

<p>This meta-learner provides fitting procedures for density estimation, finding
convex combinations of candidate density estimators by minimizing the
cross-validated negative log-likelihood loss of each candidate density. The
optimization problem is solved by making use of <code>solnp</code>,
using Lagrange multipliers. For further details, consult the documentation of
the <code>Rsolnp</code> package.
</p>

<h3>Usage</h3>

<pre>Lrnr_solnp_density</pre>

<h3>Format</h3>

<p><code>R6Class</code> object.</p>

<h3>Value</h3>

<p>Learner object with methods for training and prediction. See
<code>Lrnr_base</code> for documentation on learners.
</p>

<h3>Parameters</h3>

<dl>
<dt><code>...</code></dt><dd>Not currently used.</dd>
</dl>

<h3>Common Parameters</h3>

<p>Individual learners have their own sets of parameters. Below is a list of shared parameters, implemented by <code>Lrnr_base</code>, and shared
by all learners.
</p>

<dl>
<dt><code>covariates</code></dt><dd>A character vector of covariates. The learner will use this to subset the covariates for any specified task</dd>
<dt><code>outcome_type</code></dt><dd>A <code>variable_type</code> object used to control the outcome_type used by the learner. Overrides the task outcome_type if specified</dd>
<dt><code>...</code></dt><dd>All other parameters should be handled by the invidual learner classes. See the documentation for the learner class you're instantiating</dd>
</dl>

<h3>See Also</h3>

<p>Other Learners: <code>Custom_chain</code>, <code>Lrnr_HarmonicReg</code>, <code>Lrnr_arima</code>, <code>Lrnr_base</code>, <code>Lrnr_bilstm</code>, <code>Lrnr_condensier</code>, <code>Lrnr_cv</code>, <code>Lrnr_define_interactions</code>, <code>Lrnr_expSmooth</code>, <code>Lrnr_glm_fast</code>, <code>Lrnr_glmnet</code>, <code>Lrnr_glm</code>, <code>Lrnr_h2o_grid</code>, <code>Lrnr_independent_binomial</code>, <code>Lrnr_lstm</code>, <code>Lrnr_mean</code>, <code>Lrnr_nnls</code>, <code>Lrnr_optim</code>, <code>Lrnr_pkg_SuperLearner</code>, <code>Lrnr_randomForest</code>, <code>Lrnr_rugarch</code>, <code>Lrnr_sl</code>, <code>Lrnr_solnp</code>, <code>Lrnr_subset_covariates</code>, <code>Lrnr_tsDyn</code>, <code>Lrnr_xgboost</code>, <code>Pipeline</code>, <code>Stack</code>, <code>define_h2o_X</code>, <code>undocumented_learner</code>
</p>

<hr /><div align=center>[Package <em>sl3</em> version 1.0.0 ]</div>
" ], "text/latex": [ "\\inputencoding{utf8}\n", "\\HeaderA{Lrnr\\_solnp\\_density}{Nonlinear Optimization via Augmented Lagrange}{Lrnr.Rul.solnp.Rul.density}\n", "\\keyword{data}{Lrnr\\_solnp\\_density}\n", "%\n", "\\begin{Description}\\relax\n", "This meta-learner provides fitting procedures for density estimation, finding\n", "convex combinations of candidate density estimators by minimizing the\n", "cross-validated negative log-likelihood loss of each candidate density. The\n", "optimization problem is solved by making use of \\code{\\LinkA{solnp}{solnp}},\n", "using Lagrange multipliers. For further details, consult the documentation of\n", "the \\code{Rsolnp} package.\n", "\\end{Description}\n", "%\n", "\\begin{Usage}\n", "\\begin{verbatim}\n", "Lrnr_solnp_density\n", "\\end{verbatim}\n", "\\end{Usage}\n", "%\n", "\\begin{Format}\n", "\\code{\\LinkA{R6Class}{R6Class}} object.\n", "\\end{Format}\n", "%\n", "\\begin{Value}\n", "Learner object with methods for training and prediction. See\n", "\\code{\\LinkA{Lrnr\\_base}{Lrnr.Rul.base}} for documentation on learners.\n", "\\end{Value}\n", "%\n", "\\begin{Section}{Parameters}\n", "\n", "\\begin{description}\n", "\n", "\\item[\\code{...}] Not currently used.\n", "\n", "\\end{description}\n", "\n", "\\end{Section}\n", "%\n", "\\begin{Section}{Common Parameters}\n", "\n", "\n", "Individual learners have their own sets of parameters. Below is a list of shared parameters, implemented by \\code{Lrnr\\_base}, and shared\n", "by all learners.\n", "\n", "\\begin{description}\n", "\n", "\\item[\\code{covariates}] A character vector of covariates. The learner will use this to subset the covariates for any specified task\n", "\\item[\\code{outcome\\_type}] A \\code{\\LinkA{variable\\_type}{variable.Rul.type}} object used to control the outcome\\_type used by the learner. Overrides the task outcome\\_type if specified\n", "\\item[\\code{...}] All other parameters should be handled by the invidual learner classes. 
See the documentation for the learner class you're instantiating\n", "\n", "\\end{description}\n", "\n", "\\end{Section}\n", "%\n", "\\begin{SeeAlso}\\relax\n", "Other Learners: \\code{\\LinkA{Custom\\_chain}{Custom.Rul.chain}},\n", "\\code{\\LinkA{Lrnr\\_HarmonicReg}{Lrnr.Rul.HarmonicReg}}, \\code{\\LinkA{Lrnr\\_arima}{Lrnr.Rul.arima}},\n", "\\code{\\LinkA{Lrnr\\_base}{Lrnr.Rul.base}}, \\code{\\LinkA{Lrnr\\_bilstm}{Lrnr.Rul.bilstm}},\n", "\\code{\\LinkA{Lrnr\\_condensier}{Lrnr.Rul.condensier}}, \\code{\\LinkA{Lrnr\\_cv}{Lrnr.Rul.cv}},\n", "\\code{\\LinkA{Lrnr\\_define\\_interactions}{Lrnr.Rul.define.Rul.interactions}},\n", "\\code{\\LinkA{Lrnr\\_expSmooth}{Lrnr.Rul.expSmooth}},\n", "\\code{\\LinkA{Lrnr\\_glm\\_fast}{Lrnr.Rul.glm.Rul.fast}}, \\code{\\LinkA{Lrnr\\_glmnet}{Lrnr.Rul.glmnet}},\n", "\\code{\\LinkA{Lrnr\\_glm}{Lrnr.Rul.glm}}, \\code{\\LinkA{Lrnr\\_h2o\\_grid}{Lrnr.Rul.h2o.Rul.grid}},\n", "\\code{\\LinkA{Lrnr\\_independent\\_binomial}{Lrnr.Rul.independent.Rul.binomial}},\n", "\\code{\\LinkA{Lrnr\\_lstm}{Lrnr.Rul.lstm}}, \\code{\\LinkA{Lrnr\\_mean}{Lrnr.Rul.mean}},\n", "\\code{\\LinkA{Lrnr\\_nnls}{Lrnr.Rul.nnls}}, \\code{\\LinkA{Lrnr\\_optim}{Lrnr.Rul.optim}},\n", "\\code{\\LinkA{Lrnr\\_pkg\\_SuperLearner}{Lrnr.Rul.pkg.Rul.SuperLearner}},\n", "\\code{\\LinkA{Lrnr\\_randomForest}{Lrnr.Rul.randomForest}},\n", "\\code{\\LinkA{Lrnr\\_rugarch}{Lrnr.Rul.rugarch}}, \\code{\\LinkA{Lrnr\\_sl}{Lrnr.Rul.sl}},\n", "\\code{\\LinkA{Lrnr\\_solnp}{Lrnr.Rul.solnp}},\n", "\\code{\\LinkA{Lrnr\\_subset\\_covariates}{Lrnr.Rul.subset.Rul.covariates}},\n", "\\code{\\LinkA{Lrnr\\_tsDyn}{Lrnr.Rul.tsDyn}}, \\code{\\LinkA{Lrnr\\_xgboost}{Lrnr.Rul.xgboost}},\n", "\\code{\\LinkA{Pipeline}{Pipeline}}, \\code{\\LinkA{Stack}{Stack}},\n", "\\code{\\LinkA{define\\_h2o\\_X}{define.Rul.h2o.Rul.X}},\n", "\\code{\\LinkA{undocumented\\_learner}{undocumented.Rul.learner}}\n", "\\end{SeeAlso}" ], "text/plain": [ "Lrnr_solnp_density package:sl3 R Documentation\n", "\n", 
"_\bN_\bo_\bn_\bl_\bi_\bn_\be_\ba_\br _\bO_\bp_\bt_\bi_\bm_\bi_\bz_\ba_\bt_\bi_\bo_\bn _\bv_\bi_\ba _\bA_\bu_\bg_\bm_\be_\bn_\bt_\be_\bd _\bL_\ba_\bg_\br_\ba_\bn_\bg_\be\n", "\n", "_\bD_\be_\bs_\bc_\br_\bi_\bp_\bt_\bi_\bo_\bn:\n", "\n", " This meta-learner provides fitting procedures for density\n", " estimation, finding convex combinations of candidate density\n", " estimators by minimizing the cross-validated negative\n", " log-likelihood loss of each candidate density. The optimization\n", " problem is solved by making use of ‘solnp’, using Lagrange\n", " multipliers. For further details, consult the documentation of the\n", " ‘Rsolnp’ package.\n", "\n", "_\bU_\bs_\ba_\bg_\be:\n", "\n", " Lrnr_solnp_density\n", " \n", "_\bF_\bo_\br_\bm_\ba_\bt:\n", "\n", " ‘R6Class’ object.\n", "\n", "_\bV_\ba_\bl_\bu_\be:\n", "\n", " Learner object with methods for training and prediction. See\n", " ‘Lrnr_base’ for documentation on learners.\n", "\n", "_\bP_\ba_\br_\ba_\bm_\be_\bt_\be_\br_\bs:\n", "\n", " ‘...’ Not currently used.\n", "\n", "_\bC_\bo_\bm_\bm_\bo_\bn _\bP_\ba_\br_\ba_\bm_\be_\bt_\be_\br_\bs:\n", "\n", " Individual learners have their own sets of parameters. Below is a\n", " list of shared parameters, implemented by ‘Lrnr_base’, and shared\n", " by all learners.\n", "\n", " ‘covariates’ A character vector of covariates. The learner will\n", " use this to subset the covariates for any specified task\n", "\n", " ‘outcome_type’ A ‘variable_type’ object used to control the\n", " outcome_type used by the learner. Overrides the task\n", " outcome_type if specified\n", "\n", " ‘...’ All other parameters should be handled by the invidual\n", " learner classes. 
See the documentation for the learner class\n", " you're instantiating\n", "\n", "_\bS_\be_\be _\bA_\bl_\bs_\bo:\n", "\n", " Other Learners: ‘Custom_chain’, ‘Lrnr_HarmonicReg’, ‘Lrnr_arima’,\n", " ‘Lrnr_base’, ‘Lrnr_bilstm’, ‘Lrnr_condensier’, ‘Lrnr_cv’,\n", " ‘Lrnr_define_interactions’, ‘Lrnr_expSmooth’, ‘Lrnr_glm_fast’,\n", " ‘Lrnr_glmnet’, ‘Lrnr_glm’, ‘Lrnr_h2o_grid’,\n", " ‘Lrnr_independent_binomial’, ‘Lrnr_lstm’, ‘Lrnr_mean’,\n", " ‘Lrnr_nnls’, ‘Lrnr_optim’, ‘Lrnr_pkg_SuperLearner’,\n", " ‘Lrnr_randomForest’, ‘Lrnr_rugarch’, ‘Lrnr_sl’, ‘Lrnr_solnp’,\n", " ‘Lrnr_subset_covariates’, ‘Lrnr_tsDyn’, ‘Lrnr_xgboost’,\n", " ‘Pipeline’, ‘Stack’, ‘define_h2o_X’, ‘undocumented_learner’\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "?Lrnr_solnp_density" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Nesting the Super Learner for bin hazards with density Super Learner" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Note that `bin_estimator` can also be a Super Learner object from `sl3`. In this case, the bin hazards will be estimated by stacking several candidate estimators. For example, below we define a single density learner `lrn`, with the hazard estimator defined by a Super Learner that stacks two candidates (a GLM and an xgboost GBM). Note that, in contrast to the above example, this Super Learner fit will be optimized for the logistic regression problem (estimating the pooled bin hazards), while still using internal 10-fold cross-validation." ] }, { "cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "\n", "\n", "\t\n", "\t\n", "\t\n", "\t\n", "\t\n", "\t\n", "\n", "
<thead><tr><th scope=col>likelihood</th></tr></thead>
<tbody>
	<tr><td>0.10960448</td></tr>
	<tr><td>0.30398384</td></tr>
	<tr><td>0.30398384</td></tr>
	<tr><td>0.14400828</td></tr>
	<tr><td>0.20825493</td></tr>
	<tr><td>0.06986179</td></tr>
</tbody>
\n" ], "text/latex": [ "\\begin{tabular}{r|l}\n", " likelihood\\\\\n", "\\hline\n", "\t 0.10960448\\\\\n", "\t 0.30398384\\\\\n", "\t 0.30398384\\\\\n", "\t 0.14400828\\\\\n", "\t 0.20825493\\\\\n", "\t 0.06986179\\\\\n", "\\end{tabular}\n" ], "text/markdown": [ "\n", "likelihood | \n", "|---|---|---|---|---|---|\n", "| 0.10960448 | \n", "| 0.30398384 | \n", "| 0.30398384 | \n", "| 0.14400828 | \n", "| 0.20825493 | \n", "| 0.06986179 | \n", "\n", "\n" ], "text/plain": [ " likelihood\n", "1 0.10960448\n", "2 0.30398384\n", "3 0.30398384\n", "4 0.14400828\n", "5 0.20825493\n", "6 0.06986179" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "lrn <- Lrnr_condensier$new(nbins = 35, bin_method = \"equal.len\", bin_estimator = \n", " Lrnr_sl$new(\n", " learners = list(\n", " Lrnr_glm_fast$new(family = \"binomial\"),\n", " Lrnr_xgboost$new(nrounds = 50, objective = \"reg:logistic\")\n", " ),\n", " metalearner = Lrnr_glm$new()\n", " ))\n", "binSL_fit <- lrn$train(task)\n", "head(binSL_fit$predict(new_task))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Homework #3: Using and contributing to `sl3`" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "What learners to contribute: https://github.com/tlverse/sl3/issues/114\n", "How do learners work: https://sl3.tlverse.org/articles/custom_lrnrs.html\n", "\n", "Tips and best practices:\n", "* Fork the `sl3` repo from https://github.com/tlverse/sl3\n", "* Using git, create a new branch for your proposed contribution (e.g., `Lrnr_ranger`)\n", "* Generate a template for your new learner via `sl3::write_learner_template(here::here(\"R\", \"Lrnr_ranger.R\"))`\n", "* Fill out the template based on the properties of your new learner. 
Try looking at already written learners to get a better idea of how to fill out the various methods slots.\n", "* Write a set of unit tests under `tests/testthat/test_Lrnr_ranger.R` that ensure that your new learner works as intended/expected.\n", "* For a guide on how to write unit tests properly, try looking at the numerous unit tests that already exist inside the `sl3` package.\n", "* For background on unit testing in R, take a look here: http://r-pkgs.had.co.nz/tests.html\n", "* The `tlverse` ecosystem uses the [\"`Tidyverse` Style Guide\"](http://style.tidyverse.org/). Make sure to follow the formatting guidelines for your code and for pull requests. You can use the [`styler` R package](https://cran.r-project.org/web/packages/styler/index.html) to automatically reformat your code." ] } ], "metadata": { "kernelspec": { "display_name": "R", "language": "R", "name": "ir" }, "language_info": { "codemirror_mode": "r", "file_extension": ".r", "mimetype": "text/x-r-source", "name": "R", "pygments_lexer": "r", "version": "3.4.3" } }, "nbformat": 4, "nbformat_minor": 2 }