{ "cells": [ { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# ORAL EXAM EXAMPLE\n", "\n", "### Bayesian Machine Learning and Information Processing (5SSD0)" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "In this short notebook, we provide some examples of the type of questions that you can expect at the oral exam. In general, oral exams do not lend themselves well to proofing theorems or other exact mathematical manipulations. Instead, the focus is more on testing if you understand the conceptual ideas in this class. You should know what Bayesian machine learning (BML) is about, and talk about *how* you would solve the typical tasks that are assciated with BML.\n", "\n", "In particular, for the models that we discussed in the class, you should be able to talk about the four stages of solving problems by probabilistic modeling: (**1- Model specification**, **2- Parameter estimation** (i.e., learning from an observed data set using Bayesian inference)\n", "**3- Model evaluation** (how good is this (trained) model?), and **4- Apply model**, e.g., for prediction or classification of new data. For some models, we did not discuss all these stages, e.g., we did not dicuss Model Evaluation for all models. I don't expect you to read beyond the class notes, so if it's not treated in the class, then I don't expect you to know about it.\n", "\n", "The materials for the exam are unchanged, so it includes the notebooks (lessons 1-12) + probabilistic programming notebooks. Some notebooks contain in the first cell a link to extra \"mandatory\" materials that are also included in the tested materials (of course in the same spirit: try to understand). The Exercise notebooks are less important than for a written exam version. I advise you to read through them once and skip exercises with lots of mathematics. \n", "\n", "The style of the examination is conversational. We like to engage in a conversation with you about what you learned in the class. Finally, don't get too stressed out about this exam style. If you read and understand the notebooks reasonably well, then you can pose some obvious exam questions to yourself.\n", "\n", "Next we present some example questions." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "### Question 1: Regression\n", "\n", "- What is regression?\n", "- Which of the following two models do you recognize as a regression model? \n", "\n", " (a) For $x_n \\in \\mathbb{R}^M$, $y_n \\in \\mathbb{R}$, $\\phi(a) = 1/(1+\\exp(-a))$\n", "\n", " $$\n", " y_n = w^T \\phi(x_n) + \\epsilon_n; \\qquad \\epsilon_n \\sim \\mathcal{N}(0,\\sigma^2)\n", " $$\n", " \n", " (b) For $x_n \\in \\mathbb{R}^M$, $y_n \\in \\{0,1\\}$, $\\phi(a) = 1/(1+\\exp(-a))$ \n", "\n", " $$\n", " p(y_n|x_n) = \\phi(w^T x_n + \\epsilon_n); \\qquad \\epsilon_n \\sim \\mathcal{N}(0,\\sigma^2)\n", " $$\n", " \n", "- Let's train the parameters $w$ for this model by Bayesian inference. How would you go about this task?\n", "\n", "- Let's set a prior for the weights $w$? Any suggestions?\n", "\n", "- How do you compute the posterior from the prior?\n", "\n", "- Is the posterior a Gaussian distribution? 
\n", " \n", "\n", "\n", "\n", " " ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "### Question 2: Variational Inference\n", "\n", "- What is Variational Bayesian inference (VB)?\n", "\n", "- How is VB related to Bayes rule?\n", "\n", "- Consider a generative model $p(x,z|m)$ with $x$ observed and $z$ hidden variables. We observe a data set $D = \\{x_1, x_2, \\ldots, x_N\\}$ and define a free energy (FE) functional as (shares screen)\n", "$$ F[q] = \\sum_z q(z) \\log \\frac{q(z)}{p(D,z|m)}$$\n", "\n", "- How would you use this FE functional to find the posterior $p(z|D)$?\n", "\n", "- Can you also use FE energy minimization to estimate Bayesian model evidence $p(D|m)$?\n", "\n", "- What is the mean-field assumption? How does that help with FE minimization?" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "slideshow": { "slide_type": "skip" } }, "outputs": [ { "data": { "text/html": [ "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "open(\"../../styles/aipstyle.html\") do f\n", " display(\"text/html\", read(f,String))\n", "end" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "celltoolbar": "Slideshow", "kernelspec": { "display_name": "Julia 1.5.2", "language": "julia", "name": "julia-1.5" }, "language_info": { "file_extension": ".jl", "mimetype": "application/julia", "name": "julia", "version": "1.5.2" } }, "nbformat": 4, "nbformat_minor": 4 }