{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Lab 1 Extra: Uncertainty Propagation\n", "### Gaussian Process Summer School 2019\n", "\n", "This lab is an extension on the work introduced in Lab 1 of the summer school. It is more advanced, and you should make sure you've completed Lab 1 before attempting. It is designed to demonstrate the advantage of using models when we have only a small number of observations of a latent function $f$.\n", "\n", "Extra labs are for you to explore in your own time, giving details of other uses of Gaussian processes not covered in the summer school. Answers for extra labs will be made available _after_ the summer school." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Support for maths\n", "import numpy as np\n", "# Plotting tools\n", "from matplotlib import pyplot as plt\n", "# we use the following for plotting figures in jupyter\n", "%matplotlib inline\n", "\n", "import warnings\n", "warnings.filterwarnings('ignore')\n", "\n", "# GPy: Gaussian processes library\n", "import GPy" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## The Branin Function\n", "\n", "For this lab we will focus on the [Branin-Hoo function](https://www.sfu.ca/~ssurjano/branin.html) as our latent function. The Branin-Hoo function in two-dimensions is typically used as an _artificial landscape_, i.e. a test function for optimisation. We will the most common version of the function\n", "\n", "$$\n", " f(\\mathbf{x}) = a(x_2 - bx_1^2 + cx_1 - r)^2 + s(1-t)\\cos(x_1) + s,\n", "$$\n", "\n", "with paremeters defined:\n", "$$\n", " a = 1, \\quad b = \\frac{5.1}{4\\pi^2}, \\quad c=\\frac{5}{\\pi}, \\quad r = 6, \\quad s = 10, \\quad t=\\frac{1}{8\\pi}\n", "$$\n", "\n", "We will define the function over the space $\\Omega = [-5, 10] \\times [0, 15]$.\n", "\n", "We can create some samples of the function, $N=20$, sampled uniformly over the space:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def f(Xi, Xj): # Define Branin function\n", " a, b, c, r, s, t = 1., 5.1/(4*np.pi**2), 5/np.pi, 6., 10., 1./(8*np.pi)\n", " f_ = lambda x1,x2: \\\n", " a*(x2 - b*x1**2 + c*x1 - r)**2 + s*(1-t)*np.cos(x1) + s\n", " return np.reshape([f_(x1, x2) for (x1, x2) in zip(Xi[:], Xj[:])], Xi.shape)\n", " \n", "# Define the function space\n", "Xi, Xj = np.meshgrid(np.linspace(-5., 10., 50), np.linspace(0., 15., 50))\n", "\n", "# We will also define some samples\n", "Xsi, Xsj = np.meshgrid(np.linspace(-4.5, 9.5, 5), np.linspace(0.5, 14.5, 4))\n", "X = np.vstack((Xsi.ravel(), Xsj.ravel())).T\n", "# Our observations\n", "y = f(Xsi, Xsj).ravel()[:, None]\n", "\n", "# Plot the Branin function\n", "plt.figure(figsize=(14, 8))\n", "\n", "plt.contourf(Xi, Xj, f(Xi, Xj), levels=np.linspace(0, 300, 20))\n", "plt.plot(X[:,0], X[:,1], 'wo')\n", "\n", "plt.xlabel('$x_1$'), plt.ylabel('$x_2$'), plt.title(\"2-D Branin-Hoo function\")\n", "plt.legend(labels=[\"sample locations\"])\n", "plt.axis('square'), plt.colorbar();" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exercise 1\n", "\n", "Using `GPy`, fit a GP that is the product of two kernels over the two different inputs. Compare this fit to that of a GP with an isotropic RBF kernel." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Analysis of Uncertainty\n", "\n", "Let X be a random variable defined over the real numbers, $\\mathbb{R}$, and let $f : \\mathbb{R} \\to \\mathbb{R}$ be a mapping between real numbers. Uncertainty analysis is the study of the distribution of the random variable $f(X)$.\n", "\n", "We will assume in this example that $f$ is the Branin-Hoo function defined above and that $X$ is a random variable with *uniform* distribution over the input space of $f$." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Computing $\\mathbb{E}[f(X)]$\n", "\n", "The expectation of $f(X)$ is given by $\\int_Xf(x)\\mathrm{d}x$. The basic approach to approximating the integral is to compute the mean of our 20 observations and compare it to the true value, $\\mathbb{E}[f(X)] = 54.31$:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "np.mean(y) # Mean of our samples" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "However, we can do better by instead using Monte Carlo sampling of a Gaussian process. We define $10^5$ Monte Carlo samples over our function space:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Define the function space\n", "Ximc, Xjmc = np.meshgrid(np.linspace(-5., 10., 100), np.linspace(0., 15., 100))\n", "Xmc = np.vstack((Ximc.ravel(), Xjmc.ravel())).T" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exercise 2\n", "(a) Use your GP fit from the previous exercise to calculate the mean of the samples evaluated over your model of the Branin function. Discuss how it compares to the sample mean and true expectation of the Branin function." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "(b) Can you utilise the prediction variance to create a confidence interval for the prediction of the mean? Does the true value lie within this bound?" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Computing $P(f(X) > 200)$\n", "\n", "In many cases, we might be interested in looking at the probability that $f(x)$ will take the value above a given threshold. For example, in the case that $f$ is the response of some physical model that represents the maximum constraint in a structure, it may be necessary to ascertain the probability that the constraint exceeds some maximum value." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exercise 3\n", "(a) Use the original 20 observations to calculate an approximate estimate of the probabilty that $f(X) > 200$." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "(b) Compute the probability that the best predictor of your GP is greater than the threshold ($200$), with confidence intervals. How does your estimation compare with the true value: $P(f(X) > 200) = 1.23\\times 10^{−2}$." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "(c) Can you design and implement a procedure that updates the GP model sequentially with new points in order to improve the estimation of $P(f(X) > 200)$?" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Credit\n", "\n", "This notebook was written by Wil Ward. It is adapted from notebooks by [Rich Wilkinson](https://rich-d-wilkinson.github.io/) and [Neil Lawrence](http://inverseprobability.com/)." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.3" } }, "nbformat": 4, "nbformat_minor": 2 }