{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Big O Notation\n", "In this lecture we will go over how the syntax of Big-O Notation works and how we can describe algorithms using Big-O Notation!\n", "\n", "We previously discussed the functions below:" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# First function (Note the use of xrange since this is in Python 2)\n", "def sum1(n):\n", " '''\n", " Take an input of n and return the sum of the numbers from 0 to n\n", " '''\n", " final_sum = 0\n", " for x in xrange(n+1): \n", " final_sum += x\n", " \n", " return final_sum" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def sum2(n):\n", " \"\"\"\n", " Take an input of n and return the sum of the numbers from 0 to n\n", " \"\"\"\n", " return (n*(n+1))/2" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we want to develop a notation to objectively compare the efficiency of these two algorithms. A good place to start would be to compare the number of assignments each algorithm makes.\n", "\n", "The original **sum1** function will create an assignment **n+1** times, we can see this from the range based function. This means it will assign the final_sum variable n+1 times. We can then say that for a problem of n size (in this case just a number n) this function will take 1+n steps.\n", "\n", "This **n** notation allows us to compare solutions and algorithms relative to the size of the problem, since sum1(10) and sum1(100000) would take very different times to run but be using the same algorithm. We can also note that as n grows very large, the **+1** won't have much effect. 
So let's begin discussing how to build a syntax for this notation.\n", "________\n", "\n", "Now we will discuss how we can formalize this notation and idea.\n", "\n", "Big-O notation describes *how quickly runtime will grow relative to the input as the input gets arbitrarily large*.\n", "\n", "Let's examine some of these points more closely:\n", "\n", "* Remember, we want to compare how quickly runtime grows, not to compare exact runtimes, since those can vary depending on hardware.\n", "\n", "* Since we want to compare for a variety of input sizes, we are only concerned with runtime growth *relative* to the input. This is why we use **n** in the notation.\n", "\n", "* As n gets arbitrarily large, we only worry about the terms that grow the fastest as n gets large; for this reason, Big-O analysis is also known as **asymptotic analysis**.\n", "\n", "\n", "As for syntax, sum1() can be said to be **O(n)**, since its runtime grows linearly with the input size. In the next lecture we will go over more specific examples of various O() types. To conclude this lecture we will show the potential for vast differences in the runtimes of Big-O functions.\n", "\n", "## Runtimes of Common Big-O Functions\n", "\n", "Here is a table of common Big-O functions:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "
Big-O | Name |\n", "---|---|\n", "1 | Constant |\n", "log(n) | Logarithmic |\n", "n | Linear |\n", "nlog(n) | Log Linear |\n", "n^2 | Quadratic |\n", "n^3 | Cubic |\n", "2^n | Exponential |" ] }
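, { "cell_type": "markdown", "metadata": {}, "source": [ "To get a feel for how vastly these runtimes differ, here is a minimal sketch (the specific n values are arbitrary) that evaluates a few of the functions from the table:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import math\n", "\n", "# Evaluate a few common Big-O growth functions for increasing n;\n", "# note how much faster n^2 and n^3 grow than log(n) and nlog(n)\n", "for n in [10, 100, 1000]:\n", "    print(f'n={n}: log2(n)={math.log2(n):.1f}  nlog2(n)={n * math.log2(n):.0f}  n^2={n**2}  n^3={n**3}')" ] } ], "metadata": {}, "nbformat": 4, "nbformat_minor": 2 }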