{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Lecture 4: Conditional Probability\n", "\n", "\n", "## Stat 110, Prof. Joe Blitzstein, Harvard University\n", "\n", "----" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Definitions\n", "\n", "We continue with some basic definitions of _independence_ and _disjointness_:\n", "\n", "#### Definition: independence & disjointness\n", "\n", "> Events A and B are __independent__ if $P(A \\cap B) = P(A)P(B)$. Knowing that event A occurs tells us nothing about event B.\n", "> \n", "> In contrast, events A and B are __disjoint__ if A occurring means that B cannot occur.\n", "\n", "What about the case of events A, B and C?\n", "\n", "> Events A, B and C are __independent__ if \n", ">\n", "> \\begin\\{align\\}\n", "> P(A \\cap B) &= P(A)P(B), ~~ P(A \\cap C) = P(A)P(C), ~~ P(B \\cap C) = P(B)P(C) \\\\\\\\\n", "> P(A \\cap B \\cap C) &= P(A)P(B)P(C)\n", "> \\end\\{align\\}\n", ">\n", "> So you need both _pairwise independence_ and _three-way independence_; neither condition implies the other.\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Newton-Pepys Problem (1693)\n", "\n", "Yet another famous problem in probability that comes from a [gambling question](https://en.wikipedia.org/wiki/Newton%E2%80%93Pepys_problem).\n", "\n", "We have fair dice. Which of the following events is most likely?\n", "\n", "- $A$ ... at least one 6 with 6 dice\n", "- $B$ ... at least two 6's with 12 dice\n", "- $C$ ... at least three 6's with 18 dice\n", "\n", "Let's solve for the probability of each event using independence.\n", "\n", "\\begin{align}\n", "  P(A) &= 1 - P(A^c) ~~~~ &\\text{since the complement of at least one 6 is no 6's at all} \\\\\n", "  &= 1 - \\left(\\frac{5}{6}\\right)^6 &\\text{the 6 dice are independent, so we multiply each die's probability of not showing a 6} \\\\\n", "  &\\approx 0.665 \\\\\n", "  \\\\\n", "  P(B) &= 1 - P(\\text{no 6's}) - P(\\text{one 6}) \\\\\n", "  &= 1 - \\left(\\frac{5}{6}\\right)^{12} - 12 \\left(\\frac{1}{6}\\right)\\left(\\frac{5}{6}\\right)^{11} &\\text{... does this look familiar?}\\\\ \n", "  &\\approx 0.619 \\\\\n", "  \\\\\n", "  P(C) &= 1 - P(\\text{no 6's}) - P(\\text{one 6}) - P(\\text{two 6's}) \\\\\n", "  &= 1 - \\sum_{k=0}^{2} \\binom{18}{k} \\left(\\frac{1}{6}\\right)^k \\left(\\frac{5}{6}\\right)^{18-k} &\\text{... it's Binomial probability!} \\\\\n", "  &\\approx 0.597\n", "\\end{align}\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Conditional Probability\n", "\n", "> Conditioning is the soul of statistics.\n", "\n", "How do you update your beliefs when presented with new information? That's the question here.\n", "\n", "Consider two events $A$ and $B$. We denote the _conditional probability_ as $P(A|B)$, read _the probability of A given B_.\n", "\n", "Suppose we just observed that $B$ occurred. If $A$ and $B$ are independent, then observing $B$ tells us nothing new about $A$, so $P(A|B) = P(A)$. But if $A$ and $B$ are not independent, then the fact that $B$ happened is important information, and we need to update our uncertainty about $A$ accordingly.\n", "\n", "#### Definition: conditional probability\n", "\n", "> \\begin\\{align\\}\n", "> \\text{conditional probability } P(A|B) &= \\frac{P(A \\cap B)}{P(B)} &\\text{if }P(B)\\gt0 \\\\\n", "> \\end\\{align\\}\n", "\n", "Prof. Blitzstein gives examples of _Pebble World_ and _Frequentist World_ to help explain conditional probability, but I find that [Legos make things simple](https://www.countbayesie.com/blog/2015/2/18/bayes-theorem-with-lego)."
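] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To make the definition above concrete, here is a small Monte Carlo sketch of $P(A|B) = \\frac{P(A \\cap B)}{P(B)}$. The two events are an illustrative choice, not from the lecture: roll two fair dice, let $A$ be the event that the sum is at least 10 and $B$ the event that the first die shows a 6. Unconditionally $P(A) = \\frac{1}{6}$, but once we observe $B$ the probability updates to $P(A|B) = \\frac{1}{2}$." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import random\n", "\n", "random.seed(0)  # fix the seed for reproducibility\n", "\n", "# illustrative events (a hypothetical choice, not from the lecture):\n", "#   A = {sum of two fair dice is at least 10},  B = {first die shows a 6}\n", "n = 10**6\n", "a = b = ab = 0\n", "\n", "for _ in range(n):\n", "    d1, d2 = random.randint(1, 6), random.randint(1, 6)\n", "    if d1 + d2 >= 10:\n", "        a += 1\n", "    if d1 == 6:\n", "        b += 1\n", "        if d1 + d2 >= 10:\n", "            ab += 1\n", "\n", "# P(A|B) = P(A n B) / P(B): conditioning on B raises the probability of A from 1/6 to 1/2\n", "print('P(A)   ~ {:.4f}   (exact 1/6 = {:.4f})'.format(a / n, 1 / 6))\n", "print('P(A|B) ~ {:.4f}   (exact 1/2 = 0.5000)'.format((ab / n) / (b / n)))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Similarly, here is a quick exact check of the three Newton-Pepys probabilities computed earlier, a minimal sketch using only the Python standard library and exact `Fraction` arithmetic to avoid rounding error." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from fractions import Fraction\n", "from math import factorial\n", "\n", "def binom(n, k):\n", "    # binomial coefficient: n choose k\n", "    return factorial(n) // (factorial(k) * factorial(n - k))\n", "\n", "def prob_at_least(k, n_dice):\n", "    # exact P(at least k sixes among n_dice fair dice) via the Binomial distribution\n", "    p, q = Fraction(1, 6), Fraction(5, 6)\n", "    return 1 - sum(binom(n_dice, j) * p**j * q**(n_dice - j) for j in range(k))\n", "\n", "for label, k, n_dice in [('A', 1, 6), ('B', 2, 12), ('C', 3, 18)]:\n", "    print('P({}) = {:.4f}'.format(label, float(prob_at_least(k, n_dice))))"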
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Theorem 1\n", "\n", "The probability of the intersection of events $A$ and $B$ is given by\n", "\n", "\\begin{align}\n", "  P(A \\cap B) = P(B) P(A|B) = P(A) P(B|A)\n", "\\end{align}\n", "\n", "Note that if $A$ and $B$ are independent, then conditioning on $B$ changes nothing (and vice versa), so $P(A|B) = P(A)$ and $P(A \\cap B) = P(A) P(B)$." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Theorem 2\n", "\n", "Theorem 1 extends to $n$ events by conditioning on the earlier events one at a time (the multiplication rule):\n", "\n", "\\begin{align}\n", "  P(A_1, A_2, \\dots, A_n) = P(A_1)P(A_2|A_1)P(A_3|A_1,A_2) \\cdots P(A_n|A_1,A_2,\\dots,A_{n-1})\n", "\\end{align}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Theorem 3: Bayes' Theorem\n", "\n", "\\begin{align}\n", "  P(A|B) = \\frac{P(B|A)P(A)}{P(B)} ~~~~ \\text{this follows from Theorem 1}\n", "\\end{align}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "----\n", "\n", "## Appendix A: Bayes' Rule Expressed in Terms of Odds\n", "\n", "The _odds_ of an event with probability $p$ are $\\frac{p}{1-p}$.\n", "\n", "An event with probability $\\frac{3}{4}$ can be described as having odds _3 to 1 in favor_, or _1 to 3 against_.\n", "\n", "Let $H$ be the hypothesis, or the event we are interested in.\n", "\n", "Let $D$ be the evidence (event) we gather in order to study $H$.\n", "\n", "The _prior_ probability $P(H)$ is the probability that $H$ is true __before__ we observe any new evidence $D$.\n", "\n", "The _posterior_ probability $P(H|D)$ is the probability that $H$ is true __after__ we have observed the new evidence $D$.\n", "\n", "The _likelihood ratio_ is defined as $\\frac{P(D|H)}{P(D|H^c)}$.\n", "\n", "Applying Bayes' Rule, we can see how the _posterior odds_, _prior odds_ and _likelihood ratio_ are related:\n", "\n", "\\begin{align}\n", "  P(H|D) &= \\frac{P(D|H)P(H)}{P(D)} \\\\\n", "  \\\\\n", "  P(H^c|D) &= \\frac{P(D|H^c)P(H^c)}{P(D)} \\\\\n", "  \\\\\n", "  \\Rightarrow \\underbrace{\\frac{P(H|D)}{P(H^c|D)}}_{\\text{posterior odds of H}} &= \\underbrace{\\frac{P(H)}{P(H^c)}}_{\\text{prior odds of H}} \\times \\underbrace{\\frac{P(D|H)}{P(D|H^c)}}_{\\text{likelihood ratio}}\n", "\\end{align}\n", "\n", "----" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Appendix B: Translating Odds into Probability\n", "\n", "To go from _odds_ back to _probability_, note that for odds $p/q$ with $q = 1-p$,\n", "\n", "\\begin{align}\n", "  p = \\frac{p/q}{1 + p/q}\n", "\\end{align}\n", "\n", "----" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "View [Lecture 4: Conditional Probability | Statistics 110](http://bit.ly/2Mwpk11) on YouTube." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.3" } }, "nbformat": 4, "nbformat_minor": 1 }