{ "metadata": { "name": "" }, "nbformat": 3, "nbformat_minor": 0, "worksheets": [ { "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "

Machine Learning and Optimization with Python


\n", "\n", "This part of the advanced tutorial focuses more on some specific packages.\n", "The topics covered in this part are:\n", "\n", "- Easy and effective Machine Learning in Python with [Scikit-Learn](http://scikit-learn.org/stable/). Presented by [Maruan Al-Shedivat](https://github.com/mar-one).\n", "- Optimization in Python with [OpenOpt](http://openopt.org/). Presented by [Yiannis Hadjimichael](https://github.com/hadjimy).\n", "\n", "**Prerequisites:** *Basic and Intermediate tutorials or some experience with Python, Numpy, Scipy, Matplotlib (optional). Basic knowledge in machine learning and optimization*." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# 1. Machine Learning in Python\n", "Machine learning is a hot and popular topic nowadays. It is about building programs with tunable parameters that are adjusted automatically by adapting to previously seen data. It intervenes literally every sphere of our life ranging from science, engineering, and industry to economics, sociology, and even [personal relationships](http://www.wired.com/wiredscience/2014/01/how-to-hack-okcupid/all/). So if it is applicable everywhere, we need to have handy tools for doing machine learning." ] }, { "cell_type": "code", "collapsed": false, "input": [ "# Start pylab inline mode, so figures will appear in the notebook\n", "%pylab inline" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 1.1. Quick Overview of Scikit-Learn\n", "*Adapted from http://scikit-learn.org/stable/tutorial/basic/tutorial.html*" ] }, { "cell_type": "code", "collapsed": false, "input": [ "# Load a dataset already available in scikit-learn\n", "from sklearn import datasets\n", "digits = datasets.load_digits()\n", "type(digits)" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "digits.data" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "digits.target" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "from sklearn import svm\n", "clf = svm.SVC(gamma=0.001, C=100.)\n", "clf.fit(digits.data[:-1], digits.target[:-1])\n", "\n", "print \"Original label:\", digits.target[5]\n", "print \"Predicted label:\", clf.predict(digits.data[5])\n", "\n", "plt.figure(figsize=(3, 3))\n", "plt.imshow(digits.images[5], interpolation='nearest', cmap=plt.cm.binary)" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 1.2. Data representation in Scikit-learn\n", "*Adapted from https://github.com/jakevdp/sklearn_scipy2013*" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Data in scikit-learn, with very few exceptions, is assumed to be stored as a **two-dimensional array**, of size `[n_samples, n_features]`, and most of machine learning algorithms implemented in scikit-learn expect data to be stored this way.\n", "\n", "- **n_samples:** The number of samples: each sample is an item to process (e.g. classify). A sample can be a document, a picture, a sound, a video, an astronomical object, a row in database or CSV file, or whatever you can describe with a fixed set of quantitative traits.\n", "- **n_features:** The number of features or distinct traits that can be used to describe each item in a quantitative manner. 
Features are generally real-valued, but may be boolean or discrete-valued in some cases.\n", "\n", "The number of features must be fixed in advance. However, it can be very high-dimensional (e.g. millions of features), with most of them being zeros for a given sample. This is a case where scipy.sparse matrices can be useful, in that they are much more memory-efficient than numpy arrays." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### 1.2.1. Example: the `iris` dataset\n", "To understand these simple concepts better, let's work with an example: the `iris` dataset." ] }, { "cell_type": "code", "collapsed": false, "input": [ "# Don't know what an iris is? Run this code to display some pictures.\n", "from IPython.core.display import Image, display\n", "display(Image(filename='figures/iris_setosa.jpg'))\n", "print \"Iris Setosa\\n\"\n", "\n", "display(Image(filename='figures/iris_versicolor.jpg'))\n", "print \"Iris Versicolor\\n\"\n", "\n", "display(Image(filename='figures/iris_virginica.jpg'))\n", "print \"Iris Virginica\"" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# Load the dataset from scikit-learn\n", "from sklearn.datasets import load_iris\n", "iris = load_iris()\n", "type(iris) # The resulting dataset is a Bunch object" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "iris.keys()" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "print iris.DESCR" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# Number of samples and features\n", "iris.data.shape\n", "print \"Number of samples:\", iris.data.shape[0]\n", "print \"Number of features:\", iris.data.shape[1]\n", "print iris.data[0]" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# Target (label) names\n", "print iris.target_names" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# Visualize the data\n", "x_index = 0\n", "y_index = 1\n", "\n", "# this formatter will label the colorbar with the correct target names\n", "formatter = plt.FuncFormatter(lambda i, *args: iris.target_names[int(i)])\n", "\n", "plt.figure(figsize=(7, 5))\n", "plt.scatter(iris.data[:, x_index], iris.data[:, y_index], c=iris.target)\n", "plt.colorbar(ticks=[0, 1, 2], format=formatter)\n", "plt.xlabel(iris.feature_names[x_index])\n", "plt.ylabel(iris.feature_names[y_index])" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Quick Exercise:**\n", "Change `x_index` and `y_index` and find the 2D projection of the data which maximally separates all three classes." ] }, { "cell_type": "code", "collapsed": false, "input": [ "# your code goes here" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### 1.2.2. Other available datasets\n", "Scikit-learn makes available a host of datasets for testing learning algorithms. They come in three flavors:\n", "\n", "- **Packaged Data**: these small datasets are packaged with the scikit-learn installation, and can be loaded using the tools in `sklearn.datasets.load_*`\n", "- **Downloadable Data**: these larger datasets are available for download, and scikit-learn includes tools which streamline this process. 
These tools can be found in `sklearn.datasets.fetch_*`\n", "- **Generated Data**: there are several datasets which are generated from models based on a random seed. These are available via the `sklearn.datasets.make_*` functions." ] }, { "cell_type": "code", "collapsed": false, "input": [ "# Where datasets downloaded via the fetch_* functions are stored\n", "from sklearn.datasets import get_data_home\n", "get_data_home()" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# How to fetch an external dataset\n", "from sklearn import datasets" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now you can type `datasets.fetch_` and hit `<TAB>`; a list of available datasets will pop up." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Exercise:\n", "Fetch the dataset `olivetti_faces`. Visualize various 2D projections of the data. Find the best one in terms of separability." ] }, { "cell_type": "code", "collapsed": false, "input": [ "from sklearn.datasets import fetch_olivetti_faces" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# your code goes here\n", "# fetch the faces data\n", "# Use a script like above to plot the faces image data.\n", "# Hint: plt.cm.bone is a good colormap for this data." ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "#%load solutions/sklearn_vis_data.py" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 1.3. Linear regression and classification\n", "In this section, we explain how to use the scikit-learn library for regression and classification. We discuss the `Estimator` class - the base class for many of the scikit-learn models - and work through some examples." ] }, { "cell_type": "code", "collapsed": false, "input": [ "from sklearn.datasets import load_boston\n", "data = load_boston()\n", "print data.DESCR" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# Plot a histogram of the real estate prices\n", "plt.hist(data.target)\n", "plt.xlabel('price ($1000s)')\n", "plt.ylabel('count')" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Quick Exercise:** Try scatter-plotting several 2D projections of the data. What can you infer from the plots?" ] }, { "cell_type": "code", "collapsed": false, "input": [ "# your code goes here" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Every algorithm is exposed in scikit-learn via an `Estimator` object. For instance, a linear regression is:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "from sklearn.linear_model import LinearRegression\n", "clf = LinearRegression()\n", "clf.fit(data.data, data.target)" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "predicted = clf.predict(data.data)" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# Plot the predicted price vs. the true price\n", 
true price\n", "plt.scatter(data.target, predicted)\n", "plt.plot([0, 50], [0, 50], '--k')\n", "plt.axis('tight')\n", "plt.xlabel('True price ($1000s)')\n", "plt.ylabel('Predicted price ($1000s)')" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Examples of regression type problems in machine learning:\n", "\n", "- **Sales:** given consumer data, predict how much they will spend\n", "- **Advertising:** given information about a user, predict the click-through rate for a web ad.\n", "- **Collaborative Filtering:** given a collection of user-ratings for movies, predict preferences for other movies & users.\n", "- **Astronomy:** given observations of galaxies, predict their mass or redshift" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Another example of regularized linear regression on a synthetic data set." ] }, { "cell_type": "code", "collapsed": false, "input": [ "import numpy as np\n", "from sklearn.linear_model import LinearRegression\n", "\n", "rng = np.random.RandomState(0)\n", "x = 2 * rng.rand(100) - 1\n", "\n", "f = lambda t: 1.2 * t ** 2 + .1 * t ** 3 - .4 * t ** 5 - .5 * t ** 9\n", "y = f(x) + .4 * rng.normal(size=100)\n", "\n", "plt.figure(figsize=(7, 5))\n", "plt.scatter(x, y, s=4)" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "x_test = np.linspace(-1, 1, 100)\n", "\n", "for k in [4, 9]:\n", " X = np.array([x**i for i in range(k)]).T\n", " X_test = np.array([x_test**i for i in range(k)]).T\n", " order4 = LinearRegression()\n", " order4.fit(X, y)\n", " plt.plot(x_test, order4.predict(X_test), label='%i-th order'%(k))\n", "\n", "plt.legend(loc='best')\n", "plt.axis('tight')\n", "plt.title('Fitting a 4th and a 9th order polynomial')" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# Let's look at the ground truth\n", "plt.figure()\n", "plt.scatter(x, y, s=4)\n", "plt.plot(x_test, f(x_test), label=\"truth\")\n", "plt.axis('tight')\n", "plt.title('Ground truth (9th order polynomial)')" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's use [Ridge regression](http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.Ridge.html) (with l1-norm regularization)" ] }, { "cell_type": "code", "collapsed": false, "input": [ "from sklearn.linear_model import Ridge\n", "\n", "for k in [20]:\n", " X = np.array([x**i for i in range(k)]).T\n", " X_test = np.array([x_test**i for i in range(k)]).T\n", " order4 = Ridge(alpha=0.3)\n", " order4.fit(X, y)\n", " plt.plot(x_test, order4.predict(X_test), label='%i-th order'%(k))\n", "plt.plot(x_test, f(x_test), label=\"truth\")\n", "plt.scatter(x, y, s=4)\n", "plt.legend(loc='best')\n", "plt.axis('tight')\n", "plt.title('Fitting a 4th and a 9th order polynomial')" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "plt.ylim?" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Exercise:\n", "For this exercise, you need to classify iris data with nearest neighbors classifier and with linear support vector classifier (LinearSVC).\n", "In order to measure the performance, calculate the accuracy or just use `clf.score` function built-in into the `Estimator` object." 
] }, { "cell_type": "code", "collapsed": false, "input": [ "from __future__ import division # turn off division truncation -- division will be floating point by default\n", "from sklearn.neighbors import KNeighborsClassifier\n", "from sklearn.svm import LinearSVC\n", "from sklearn.datasets import load_iris\n", "iris = load_iris()\n", "\n", "# Splitting the data into train and test sets\n", "indices = np.random.permutation(range(iris.data.shape[0]))\n", "train_sz = floor(0.8*iris.data.shape[0])\n", "X, y = iris.data[indices[:train_sz],:], iris.target[indices[:train_sz]]\n", "Xt, yt = iris.data[indices[train_sz:],:], iris.target[indices[train_sz:]]\n", "\n", "# your code goes here" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "#%load solutions/iris_classification.py" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 1.4. Approaches to validation and testing\n", "It is usually an arduous and time consuming task to write cross-validation or other testing wrappers for your models. Scikit-learn allows you validate your models out-of-the-box. Let's have a look at how to do this." ] }, { "cell_type": "code", "collapsed": false, "input": [ "from sklearn import cross_validation\n", "X, Xt, y, yt = cross_validation.train_test_split(iris.data, iris.target, test_size=0.4, random_state=0)\n", "print X.shape, y.shape\n", "print Xt.shape, yt.shape" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# Cross-validating your models in one line\n", "scores = cross_validation.cross_val_score(svc, iris.data, iris.target, cv=5,)\n", "print scores\n", "print scores.mean()" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "cross_validation.cross_val_score?" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "# Calculate F1-measure (or any other score)\n", "from sklearn import metrics\n", "f1_scores = cross_validation.cross_val_score(svc, iris.data, iris.target, cv=5, scoring='f1')\n", "print f1_scores\n", "print f1_scores.mean()" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Quick Exercise:** Do the same CV-evaluation of the KNN classifier. Find the best number of nearest neighbors." ] }, { "cell_type": "code", "collapsed": false, "input": [], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 1.5. What to learn next?\n", "If you would like to start using Scikit-learn for your machine learning tasks, the most right way is to jump into it and refer to [the documentation](http://scikit-learn.org/stable/documentation.html) from time to time. When you acquire enough skills in it, you can go and check out the following resources:\n", "\n", "- [A tutorial from PyCon 2013](https://github.com/jakevdp/sklearn_pycon2013)\n", "- [A tutorial from SciPy 2013 conference](https://github.com/jakevdp/sklearn_scipy2013) (check out the advanced track!)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![Scikit-learn algorithm cheatsheet](http://scikit-learn.org/dev/_static/ml_map.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# 2. 
{ "cell_type": "markdown", "metadata": {}, "source": [ "### 1.5. What to learn next?\n", "If you would like to start using Scikit-learn for your machine learning tasks, the best way is to dive right in and refer to [the documentation](http://scikit-learn.org/stable/documentation.html) from time to time. Once you have acquired enough skill, check out the following resources:\n", "\n", "- [A tutorial from PyCon 2013](https://github.com/jakevdp/sklearn_pycon2013)\n", "- [A tutorial from SciPy 2013 conference](https://github.com/jakevdp/sklearn_scipy2013) (check out the advanced track!)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![Scikit-learn algorithm cheatsheet](http://scikit-learn.org/dev/_static/ml_map.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# 2. Optimization in Python with OpenOpt" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In this part, we'll learn how to solve optimization problems using the **OpenOpt** package.\n", "\n", "**OpenOpt** is a free, open-source package (released under the [BSD license](http://openopt.org/license)) for the numerical optimization of a variety of linear and nonlinear problems, least-squares problems, and network and stochastic problems.\n", "\n", "It contains more than 30 different solvers (both free and commercial), including solvers written in C and Fortran.\n", "Some solvers are already included in the OpenOpt package (e.g. [`ralg`](http://openopt.org/ralg), \n", "[`gsubg`](http://openopt.org/gsubg), [`interalg`](http://openopt.org/interalg)) and others rely on external packages \n", "(e.g. [`scipy`](http://www.scipy.org), [`IPOPT`](http://openopt.org/IPOPT), [`ALGENCAN`](http://openopt.org/ALGENCAN)).\n", "\n", "In this demonstration we consider solvers available in OpenOpt and SciPy. The library workstations already have OpenOpt\n", "installed, and SciPy is included in the Python distribution.\n", "For instructions on how to install OpenOpt on other machines, go to: [http://openopt.org/Install](http://openopt.org/Install)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Acknowledgements:\n", "\n", "- OpenOpt's [website](http://openopt.org)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Optimization formulation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "An optimization problem can be expressed in the following way:\n", "\n", "- Given: a function $f : A \\to R$ from some set $A$ to the set of real numbers.\n", "- Want to find: a solution $x_0 \\in A$ such that $f(x_0)$ is the minimum (or maximum) of $f(x)$ for all $x \\in A$." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For example, the standard form of a continuous optimization problem is:\n", "\n", "$\n", " \\text{max (or min) } \\mathbf{f(x)}\n", "$\n", "\n", "$\n", " \\text{subject to } \\\\\n", "$\n", "\n", "$ \n", " \\mathbf{lb} \\le \\mathbf{x} \\le \\mathbf{ub} \\\\\n", " \\mathbf{A x} \\le \\mathbf{b} \\\\\n", " \\mathbf{A}_\\mathbf{eq} \\mathbf{x} = \\mathbf{b}_\\mathbf{eq} \\\\\n", " \\mathbf{c_i(x) \\le 0} \\quad \\forall i=0,...,I \\\\\n", " \\mathbf{h_j(x) = 0} \\quad \\forall j=0,...,J \\\\\n", " \\mathbf{x \\in R^n}\n", "$\n", "\n", "where $\\mathbf{f, c_i, h_j} : R^n \\to R$." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Depending on the type of the objective function $\\mathbf{f(x)}$ and the constraints, the optimization problem is classified as linear, quadratic, nonlinear, constrained, unconstrained, etc." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### OpenOpt syntax\n", "\n", "In *OpenOpt*, optimization problems are assigned in the following way:\n", "\n", "> from openopt import NLP\n", "\n", "or other constructor names: LP, MILP, QP, etc., according to the problem type.\n", "\n", "Then we use\n", "\n", "> p = NLP(\*args, \**kwargs)\n", "\n", "to assign the constructor to the object *p*. \n", "\n", "Each constructor class has its own arguments,\n", "\n", "e.g. 
for NLP it is at least *f* and *x0*: the objective function and the initial guess.\n", "\n", "Thus, using\n", "\n", "> p = NLP(objfun, initialguess)\n", "\n", "will assign *objfun* to *f* and *initialguess* to *x0*.\n", "\n", "We can use\n", "\n", "> p.kwargs\n", "\n", "to change any *kwargs* defined in the constructor.\n", "\n", "Finally, we solve the problem by\n", "\n", "> r = p.solve(nameOfSolver, otherParameters)\n", "\n", "The object *r* contains all the results, i.e. the minimizer, the value of the objective function at the solution, and other output\n", "parameters. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Adding the OpenOpt path to PYTHONPATH\n", "\n", "The best way to see how it works is to go through some examples.\n", "\n", "But since OpenOpt is an additional package, not included in the Python distribution we use, we first need to add its path to the `PYTHONPATH`.\n", "\n", "If you are using a library workstation this can be done in the notebook by running:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "import sys\n", "sys.path.append('/Library/Python/2.7/site-packages/openopt-0.52-py2.7.egg')" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "On other machines (Linux or Mac) add the following lines to your .bashrc (Linux) or .bash_profile (Mac):\n", "\n", "> export OPENOPT=path-where-the-OOSuite-is\n", "\n", "> export PYTHONPATH=\$PYTHONPATH:\$OPENOPT\n", "\n", "If you don't have such a file then create one." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Example 1: Linear problem\n", "\n", "A linear optimization problem has the following form:\n", "\n", "$\n", " \\text{max (or min) } \\mathbf{c}^T\\mathbf{x}\n", "$\n", "\n", "$\n", " \\text{subject to } \\\\\n", "$\n", "\n", "$ \n", " \\mathbf{lb} \\le \\mathbf{x} \\le \\mathbf{ub} \\\\\n", " \\mathbf{A x} \\le \\mathbf{b} \\\\\n", " \\mathbf{A}_\\mathbf{eq} \\mathbf{x} = \\mathbf{b}_\\mathbf{eq} \\\\\n", " \\mathbf{x \\in R^n}\n", "$\n", "\n", "Consider the following problem adapted from the OpenOpt examples on [Linear Problems](http://trac.openopt.org/openopt/browser/PythonPackages/OpenOpt/openopt/examples/lp_1.py):" ] }, { "cell_type": "code", "collapsed": false, "input": [ "\"\"\"\n", "Example:\n", "Let's consider the problem\n", "15x1 + 8x2 + 80x3 -> min (1)\n", "subject to\n", "x1 + 2x2 + 3x3 <= 15 (2)\n", "8x1 + 15x2 + 80x3 <= 80 (3)\n", "8x1 + 80x2 + 15x3 <= 150 (4)\n", "-100x1 - 10x2 - x3 <= -800 (5)\n", "80x1 + 8x2 + 15x3 = 750 (6)\n", "x1 + 10x2 + 100x3 = 80 (7)\n", "x1 >= 4 (8)\n", "-80 <= x2 <= -8 (9)\n", "\"\"\"\n", "\n", "from numpy import *\n", "from openopt import LP\n", "f = array([15, 8, 80])\n", "A = array([[1, 2, 3], [8, 15, 80], [8, 80, 15], [-100, -10, -1]])\n", "b = array([15, 80, 150, -800])\n", "Aeq = array([[80, 8, 15], [1, 10, 100]])\n", "beq = array([750, 80])\n", "\n", "lb = array([4, -80, -inf])\n", "ub = array([inf, -8, inf])\n", "p = LP(f, A=A, Aeq=Aeq, b=b, beq=beq, lb=lb, ub=ub)\n", "# or p = LP(f=f, A=A, Aeq=Aeq, b=b, beq=beq, lb=lb, ub=ub)\n", "\n", "#r = p.minimize('glpk') # CVXOPT must be installed\n", "#r = p.minimize('lpSolve') # lpsolve must be installed\n", "r = p.minimize('pclp')\n", "# search for max: r = p.maximize('glpk') # CVXOPT & glpk must be installed\n", "\n", "print('objFunValue: %f' % r.ff) # should print 204.48841578\n", "print('x_opt: %s' % r.xf) # should print [ 9.89355041 -8. 1.5010645 ]"
], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can easily verify that the solution satisfies the constraints:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "import numpy as np" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "print np.linalg.norm(np.dot(Aeq,r.xf) - beq) # equality residual, should be ~0\n", "print np.dot(A,r.xf) - b <= 0 # inequality constraints, should all be True" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Example 2: Solving nonlinear equations\n", "\n", "Solving nonlinear equations can be formulated as minimizing the 2-norm of the vector of residuals.\n", "\n", "Below is a script solving a system of three equations (from the OpenOpt examples on [Non-linear least squares](http://trac.openopt.org/openopt/browser/PythonPackages/OpenOpt/openopt/examples/nllsp_1.py))." ] }, { "cell_type": "code", "collapsed": false, "input": [ "\"\"\"\n", "Let us solve the overdetermined nonlinear equations:\n", "a^2 + b^2 = 15\n", "a^4 + b^4 = 100\n", "a = 3.5\n", "\n", "Let us consider the problem as\n", "x[0]**2 + x[1]**2 - 15 = 0\n", "x[0]**4 + x[1]**4 - 100 = 0\n", "x[0] - 3.5 = 0\n", "\n", "Now we will solve it using the solver scipy_leastsq\n", "\"\"\"\n", "%pylab inline --no-import-all\n", "from openopt import NLLSP\n", "from openopt import NLP\n", "\n", "f = lambda x: ((x**2).sum() - 15, (x**4).sum() - 100, x[0]-3.5)\n", "# other possible f assignments:\n", "# f = lambda x: [(x**2).sum() - 15, (x**4).sum() - 100, x[0]-3.5]\n", "# f = [lambda x: (x**2).sum() - 15, lambda x: (x**4).sum() - 100, lambda x: x[0]-3.5]\n", "# f = (lambda x: (x**2).sum() - 15, lambda x: (x**4).sum() - 100, lambda x: x[0]-3.5)\n", "# f = lambda x: asfarray(((x**2).sum() - 15, (x**4).sum() - 100, x[0]-3.5))\n", "\n", "# optional: gradient\n", "def df(x):\n", "    r = zeros((3,2))\n", "    r[0,0] = 2*x[0]\n", "    r[0,1] = 2*x[1]\n", "    r[1,0] = 4*x[0]**3\n", "    r[1,1] = 4*x[1]**3\n", "    r[2,0] = 1\n", "    return r\n", "\n", "# initial estimate of the solution - sometimes a rather precise one is very important\n", "x0 = [1.5, 8]\n", "\n", "#p = NLLSP(f, x0, diffInt = 1.5e-8, xtol = 1.5e-8, ftol = 1.5e-8)\n", "# or\n", "# p = NLLSP(f, x0)\n", "# or\n", "p = NLLSP(f, x0, xtol = 1.5e-8, ftol = 1.5e-8)\n", "\n", "# optional: user-supplied gradient check:\n", "#p.checkdf() # requires DerApproximator to be installed\n", "\n", "r = p.solve('scipy_leastsq', plot=1, iprint = 1)\n", "\n", "print 'x_opt:', r.xf # 2.74930862, +/-2.5597651\n", "print 'funcs Values:', p.f(r.xf) # [-0.888904734668, 0.0678251418575, -0.750691380965]\n", "print 'f_opt:', r.ff, '; sum of squares (should be same value):', (p.f(r.xf) ** 2).sum() # 1.35828942657\n" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Example 3: Least-squares problem\n", "\n", "Consider some 2D data. We would like to fit a model to the measured data, minimizing the discrepancy between the model predictions and the data. This can be formulated as minimizing the sum of the squares of the differences between the model values and the measured data."
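] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As a warm-up, here is a minimal sketch of the same idea for a purely linear model, fitted with `np.linalg.lstsq` (the line model y = a*x + b and the synthetic data in this sketch are illustrative assumptions, not part of the OpenOpt example below):" ] }, { "cell_type": "code", "collapsed": false, "language": "python", "metadata": {}, "outputs": [], "input": [ "# Sketch: linear least squares, y ~ a*x + b (synthetic, illustrative data)\n", "import numpy as np\n", "xs = np.linspace(0, 1, 20)\n", "ys = 2.0*xs + 1.0 + 0.1*np.random.randn(20) # noisy line\n", "M = np.column_stack([xs, np.ones_like(xs)]) # design matrix [x, 1]\n", "(a, b), res, rank, sv = np.linalg.lstsq(M, ys)\n", "print 'a, b =', a, b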
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "First create some data and plot them:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "import matplotlib.pyplot as plt" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "n = 200\n", "X = np.linspace(0,2,n)\n", "m = lambda X : 0.8*np.exp(-1.5*X)+1.2*np.exp(-0.8*X)\n", "perturb = 0.2*np.random.uniform(0,1,n)\n", "Y = m(X)*(1.0+perturb)" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "plt.figure(figsize=(15,10), dpi=100) # changing figure's shape\n", "plt.plot(X,Y,'.')\n", "plt.xlabel('$X$',fontsize=16) # horizontal axis name\n", "plt.ylabel('Y',fontsize=16) # vertical axis name\n", "plt.title('Sample data',fontsize=18) # title \n", "plt.grid(True) # enabling grid" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's solve the least square problem:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "\"\"\"\n", "In the DFP example we will search for z=(a, b, c, d) \n", "that minimizes Sum_i || F(z, X_i) - Y_i ||^2\n", "for the function\n", "F(x) = a*exp(-b*x) + c*exp(-d*x)\n", "\"\"\"\n", "from openopt import DFP\n", "\n", "f = lambda z, X: z[0]*exp(-z[1]*X[0]) + z[2]*exp(-z[3]*X[0])\n", "initEstimation = [0.2, 0.9, 1.9, 0.9]\n", "lb = [-np.inf, -np.inf, -np.inf, -np.inf]\n", "ub = [np.inf, np.inf, np.inf, np.inf]\n", "p = DFP(f, initEstimation, X, Y, lb=lb, ub=ub)\n", "\n", "p.df = lambda z, X: [exp(-z[1]*X[0]), -z[0]*z[1]*exp(-z[1]*X[0]), exp(-z[3]*X[0]), -z[2]*z[3]*exp(-z[3]*X[0])]\n", "\n", "r = p.solve('nlp:ralg', plot=1, iprint = 10)\n", "f = lambda z, X: z[0]*exp(-z[1]*X) + z[2]*exp(-z[3]*X)\n", "print('solution: '+str(r.xf)+'\\n||residuals||^2 = '+str(r.ff))" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "code", "collapsed": false, "input": [ "fun = lambda z, X: z[0]*exp(-z[1]*X) + z[2]*exp(-z[3]*X)\n", "\n", "plt.figure(figsize=(15,10), dpi=100) # changing figure's shape\n", "plt.plot(X,Y,'.')\n", "plt.plot(X,fun(r.xf,X),'r-',linewidth=1.5)\n", "plt.xlabel('$X$',fontsize=16) # horizontal axis name\n", "plt.ylabel('Y',fontsize=16) # vertical axis name\n", "plt.title('Sample data',fontsize=18) # title \n", "plt.grid(True) # enabling grid" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Example 4: Nonlinear problem\n", "\n", "Finally, the following is an example of a non-linear problem (from OpenOpt [non-linear examples](http://trac.openopt.org/openopt/browser/PythonPackages/OpenOpt/openopt/examples/nlp_1.py)) as described at the beginning of this part." ] }, { "cell_type": "code", "collapsed": false, "input": [ "\"\"\"\n", "Example:\n", "(x0-5)^2 + (x2-5)^2 + ... 
+(x149-5)^2 -> min\n", "\n", "subject to\n", "\n", "# lb <= x <= ub:\n", "x4 <= 4\n", "8 <= x5 <= 15\n", "\n", "# Ax <= b\n", "x0+...+x149 >= 825\n", "x9 + x19 <= 7\n", "x10+x11 <= 9\n", "\n", "# Aeq x = beq\n", "x100+x101 = 11\n", "\n", "# c(x) <= 0\n", "2*x0^4-32 <= 0\n", "x1^2+x2^2-8 <= 0\n", "\n", "# h(x) = 0\n", "(x[149]-1)**6 = 0\n", "(x[148]-1.5)**6 = 0\n", "\"\"\"\n", "\n", "from openopt import NLP\n", "from numpy import cos, arange, ones, asarray, zeros, mat, array\n", "N = 150\n", "\n", "# objective function:\n", "f = lambda x: ((x-5)**2).sum()\n", "\n", "# objective function gradient (optional):\n", "df = lambda x: 2*(x-5)\n", "\n", "# start point (initial estimate)\n", "x0 = 8*cos(arange(N))\n", "\n", "# c(x) <= 0 constraints\n", "c = [lambda x: 2* x[0] **4-32, lambda x: x[1]**2+x[2]**2 - 8]\n", "\n", "# dc(x)/dx: non-lin ineq constraints gradients (optional):\n", "dc0 = lambda x: [8 * x[0]**3] + [0]*(N-1)\n", "dc1 = lambda x: [0, 2 * x[1], 2 * x[2]] + [0]*(N-3)\n", "dc = [dc0, dc1]\n", "\n", "# h(x) = 0 constraints\n", "def h(x):\n", "    return (x[N-1]-1)**6, (x[N-2]-1.5)**6\n", "    # other possible return types: numpy array, matrix, Python list, tuple\n", "# or just h = lambda x: [(x[149]-1)**6, (x[148]-1.5)**6]\n", "\n", "\n", "# dh(x)/dx: non-lin eq constraints gradients (optional):\n", "def dh(x):\n", "    r = zeros((2, N))\n", "    r[0, -1] = 6*(x[N-1]-1)**5\n", "    r[1, -2] = 6*(x[N-2]-1.5)**5\n", "    return r\n", "\n", "# lower and upper bounds on variables\n", "lb = -6*ones(N)\n", "ub = 6*ones(N)\n", "ub[4] = 4\n", "lb[5], ub[5] = 8, 15\n", "\n", "# general linear inequality constraints\n", "A = zeros((3, N))\n", "A[0, 9] = 1\n", "A[0, 19] = 1\n", "A[1, 10:12] = 1\n", "A[2] = -ones(N)\n", "b = [7, 9, -825]\n", "\n", "# general linear equality constraints\n", "Aeq = zeros(N)\n", "Aeq[100:102] = 1\n", "beq = 11\n", "\n", "# required constraints tolerance, default for NLP is 1e-6\n", "contol = 1e-7\n", "\n", "# NB: if you use the solver algencan, it ignores xtol and ftol; using maxTime, maxCPUTime, maxIter, maxFunEvals, fEnough is recommended instead.\n", "# Note that in algencan gtol means the norm of the projected gradient of the Augmented Lagrangian,\n", "# so it should be something like 1e-3...1e-5\n", "gtol = 1e-7 # (default gtol = 1e-6)\n", "\n", "# Assign problem:\n", "# 1st arg - objective function\n", "# 2nd arg - start point\n", "p = NLP(f, x0, df=df, c=c, dc=dc, h=h, dh=dh, A=A, b=b, Aeq=Aeq, beq=beq, \n", "        lb=lb, ub=ub, gtol=gtol, contol=contol, iprint = 50, maxIter = 10000, maxFunEvals = 1e7, name = 'NLP_1')\n", "\n", "# optional: graphical output, requires pylab (matplotlib)\n", "p.plot = True\n", "\n", "solver = 'ralg'\n", "#solver = 'scipy_slsqp'\n", "\n", "# solve the problem\n", "r = p.solve(solver, plot=1) # string argument is the solver name\n", "\n", "# r.xf and r.ff are the optimal point and the optimal objFun value\n", "# r.ff should be something like 132.05\n", "print r.ff" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ " " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "*Copyright 2014, Maruan Al-Shedivat and Yiannis Hadjimichael, ACM Student Chapter Members.*" ] } ], "metadata": {} } ] }