{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Sum-of-Squares (SOS) optimization\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Background\n", "Sum-of-Squares (SOS) optimization is a special type of convex optimization. This type of optimization is formulated as a Semidefinite Programming (SDP) problem, which can be solved by several of the solvers supported by Drake's MathematicalProgram.\n", "\n", "A polynomial $p(x)$ is sum-of-squares if\n", "\\begin{align}\n", "p(x) = \\sum_i q_i(x)^2\n", "\\end{align}\n", "where each $q_i(x)$ is also a polynomial of the indeterminates $x$. $p(x)$ being sum-of-squares is a *sufficient* condition for $p(x)$ being non-negative for all $x$. In some cases it is also a necessary condition. For more details on the relationship between sum-of-squares and non-negativity of a polynomial, refer to Chapter 3 of [Semidefinite Optimization and Convex Algebraic Geometry](http://www.mit.edu/~parrilo/sdocag/).\n", "\n", "A polynomial $p(x)$ being sum-of-squares is equivalent to the existence of a positive semidefinite (psd) matrix $Q$, such that\n", "\\begin{align}\n", "p(x) = z(x)^T Q z(x), Q\\succeq 0\n", "\\end{align}\n", "where $z(x)$ contains all the monomials of $x$ up to a certain degree. If $x\\in\\mathbb{R}^n$ and $p(x)$ has degree $2d$, then $z(x)$ contains all the monomials of $x$ up to degree $d$, namely $z(x)$ has size ${{n+d}\\choose{d}}$. So searching for a sum-of-squares polynomial $p(x)$ is equivalent to searching for a positive semidefinite matrix $Q$, with the additional constraint that the polynomials $z(x)^TQz(x)$ and $p(x)$ are the same (they have the same coefficients).\n", "\n", "For example, for the polynomial\n", "\\begin{align}\n", "p(x) = 2x_1^4 + 5x_2^4 - x_1^2x_2^2 + 2x_1^3x_2 + 2x_1 + 2,\n", "\\end{align}\n", "we have\n", "\\begin{align}\n", "p(x) = \\underbrace{\\begin{bmatrix} 1 \\\\x_1 \\\\x_2 \\\\ x_1^2 \\\\ x_1x_2 \\\\ x_2^2\\end{bmatrix}^T}_{z(x)^T} \\underbrace{\\frac{1}{3}\\begin{bmatrix} 6 & 3 & 0 & -2 & 0 & -2\\\\ 3 & 4 & 0 & 0 & 0 & 0\\\\0 & 0 & 4 & 0 & 0 & 0\\\\-2 & 0 & 0 & 6 & 3 & -4\\\\ 0 & 0 & 0 & 3 & 5 & 0\\\\ -2 & 0 & 0 & -4 & 0 & 15\\end{bmatrix}}_{Q} \\underbrace{\\begin{bmatrix} 1 \\\\ x_1 \\\\x_2 \\\\x_1^2 \\\\x_1x_2\\\\ x_2^2\\end{bmatrix}}_{z(x)}\n", "\\end{align}\n", "where this $Q$ matrix is positive semidefinite, hence $p(x)$ is a sum-of-squares polynomial.\n", "In fact, $p(x)$ can be decomposed as the sum-of-squares\n", "\\begin{align}\n", "p(x) = \\frac{4}{3}x_2^2 + \\frac{1349}{705}x_2^4 + \\frac{1}{12}(4x_1+3)^2+\\frac{1}{15}(3x_1^2+5x_1x_2)^2+\\frac{1}{315}(-21x_1^2+20x_2^2+10)^2+\\frac{1}{59220}(328x_2^2-235)^2\n", "\\end{align}" ] },
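{ "cell_type": "markdown", "metadata": {}, "source": [ "As a quick sanity check of the example above, we can verify the Gram-matrix claim numerically using only `numpy`: the matrix $Q$ should have no negative eigenvalues (up to floating-point rounding), and $z(x)^T Q z(x)$ should reproduce $p(x)$ at sampled points. The cell below is a minimal sketch of that check; it imports `numpy` locally so it can be run on its own, independently of the imports used later in this notebook." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "\n", "# The Gram matrix Q from the example above (including the 1/3 factor).\n", "Q = np.array([[6, 3, 0, -2, 0, -2],\n", "              [3, 4, 0, 0, 0, 0],\n", "              [0, 0, 4, 0, 0, 0],\n", "              [-2, 0, 0, 6, 3, -4],\n", "              [0, 0, 0, 3, 5, 0],\n", "              [-2, 0, 0, -4, 0, 15]]) / 3.0\n", "\n", "# Q should be positive semidefinite: its smallest eigenvalue should be\n", "# non-negative (up to rounding).\n", "print(\"smallest eigenvalue of Q =\", np.min(np.linalg.eigvalsh(Q)))\n", "\n", "\n", "def p(x1, x2):\n", "    return 2 * x1**4 + 5 * x2**4 - x1**2 * x2**2 + 2 * x1**3 * x2 + 2 * x1 + 2\n", "\n", "\n", "# z(x)^T Q z(x) should match p(x) at sampled points.\n", "for x1, x2 in [(0.5, -1.2), (1.3, 0.7), (-0.8, 2.0)]:\n", "    z = np.array([1, x1, x2, x1**2, x1 * x2, x2**2])\n", "    print(\"|z^T Q z - p(x)| =\", abs(z.dot(Q).dot(z) - p(x1, x2)))" ] },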
{ "cell_type": "markdown", "metadata": {}, "source": [ "## Solving sum-of-squares problems\n", "In a sum-of-squares problem, we often want to find a sum-of-squares polynomial that satisfies certain additional properties. Note that there are two different types of variables in sum-of-squares problems: the *indeterminates* and the *decision variables*.\n", "* Decision variables: the coefficients of the polynomial are affine functions of our decision variables. We are optimizing over the decision variables.\n", "* Indeterminates: variables that can take any value. In the example above $x$ is an indeterminate -- the constraints that we write must hold $\\forall x$; we are *not* optimizing over these indeterminates.\n", "\n", "There are some functions commonly used in sum-of-squares optimization within Drake's MathematicalProgram. We list some of them here:\n", "* `MathematicalProgram.NewIndeterminates` Declares and returns new indeterminates.\n", "* `MathematicalProgram.AddSosConstraint` Imposes a sum-of-squares (SOS) constraint on a polynomial.\n", "* `MathematicalProgram.NewSosPolynomial` Creates and returns a new polynomial, and constrains this polynomial to be sum-of-squares (SOS).\n", "* `MathematicalProgram.NewFreePolynomial` Creates and returns a new polynomial. No constraint has been imposed on this polynomial yet.\n", "* `MathematicalProgram.AddEqualityConstraintBetweenPolynomials` Imposes the linear constraint that two polynomials are equal (they have the same coefficients for matching monomials).\n", "\n", "We will describe the details of each function later. First let's look at some examples." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Imports" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import math\n", "\n", "import numpy as np\n", "import matplotlib.pyplot as plt\n", "\n", "from pydrake.solvers.mathematicalprogram import MathematicalProgram, Solve\n", "import pydrake.symbolic as sym" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Example 1\n", "Suppose we want to solve the following optimization problem (Example 3.50 of [Semidefinite Optimization and Convex Algebraic Geometry](http://www.mit.edu/~parrilo/sdocag/))\n", "\\begin{align}\n", "\\min_{a}& -a_0 - a_1\\\\\n", "\\text{s.t }& x^4 + a_0 x + 2 + a_1 \\text{ is SOS}\\\\\n", "&(a_0 - a_1 + 1)x^2 + a_1x + 1\\text{ is SOS}\n", "\\end{align}\n", "In this case, $x$ is the indeterminate and the entries of $a$ are the decision variables.\n", "\n", "The code below shows how to model and solve this SOS problem in Drake's MathematicalProgram." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "\"\"\"\n", "Code for example 1.\n", "\"\"\"\n", "\n", "# Initialize an empty optimization program.\n", "prog = MathematicalProgram()\n", "# Declare \"a\" as decision variables.\n", "a = prog.NewContinuousVariables(2)\n", "# Declare \"x\" as indeterminates.\n", "x = prog.NewIndeterminates(1)\n", "\n", "# Declare p1(x) = pow(x, 4) + a0 * x + 2 + a1. The second argument\n", "# sym.Variables(x) indicates that the indeterminate in the polynomial is x.\n", "p1 = sym.Polynomial(x[0]**4 + a[0] * x[0] + 2 + a[1], sym.Variables(x))\n", "# Declare p2(x) = (a0 - a1 + 1) * pow(x, 2) + a1 * x + 1. The second argument\n", "# sym.Variables(x) indicates that the indeterminate in the polynomial is x.\n", "p2 = sym.Polynomial((a[0] - a[1] + 1) * x[0] * x[0] + a[1] * x[0] + 1, sym.Variables(x))\n", "# Add the constraint that p1(x) and p2(x) are SOS.\n", "prog.AddSosConstraint(p1)\n", "prog.AddSosConstraint(p2)\n", "\n", "# Add the cost -a0 - a1.\n", "prog.AddCost(-a[0] - a[1])\n", "\n", "# Solve the problem.\n", "result = Solve(prog)\n", "\n", "# Retrieve the solution.\n", "print(result.get_solution_result())\n", "print(\"a =\", result.GetSolution(a))" ] },
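{ "cell_type": "markdown", "metadata": {}, "source": [ "Because $p_1(x)$ and $p_2(x)$ are constrained to be SOS, they must be non-negative for every $x$ at the returned solution. The cell below is a small sanity check of that claim (it assumes the cell above has already been run): it substitutes the solved $a$ back into both polynomials and evaluates them on a grid of $x$ values." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Substitute the solution back into p1(x) and p2(x) and evaluate them on a\n", "# grid of x values; both minima should be (numerically) non-negative.\n", "a_sol = result.GetSolution(a)\n", "x_samples = np.linspace(-3, 3, 61)\n", "p1_vals = x_samples**4 + a_sol[0] * x_samples + 2 + a_sol[1]\n", "p2_vals = (a_sol[0] - a_sol[1] + 1) * x_samples**2 + a_sol[1] * x_samples + 1\n", "print(\"min of p1 on the grid =\", p1_vals.min())\n", "print(\"min of p2 on the grid =\", p2_vals.min())" ] },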
{ "cell_type": "markdown", "metadata": {}, "source": [ "### Example 2\n", "\\begin{align}\n", "\\min_{a, b, c}&\\; a + b + c\\\\\n", "\\text{s.t }& x^4 + ax^3 + bx^2 + cx + 1 \\ge 0 \\quad\\forall x\\in[0, 1]\n", "\\end{align}\n", "\n", "To solve this problem, we use a special case of Theorem 3.72 from\n", "[Semidefinite Optimization and Convex Algebraic Geometry](http://www.mit.edu/~parrilo/sdocag/):\n", "An even-degree univariate polynomial $p(x)$ is nonnegative on the interval $[0, 1]$ if and only if it can be written as\n", "\\begin{align}\n", "p(x) = s(x) + x(1-x)t(x)\n", "\\end{align}\n", "where $s(x), t(x)$ are both sum-of-squares.\n", "\n", "Hence we can reformulate the problem as\n", "\\begin{align}\n", "\\min_{a, b, c, t}&\\; a+b+c\\\\\n", "\\text{s.t }& x^4 + ax^3 + bx^2 + cx + 1 - x(1-x)t(x) \\text{ is SOS}\\\\\n", "& t(x) \\text{ is SOS}\n", "\\end{align}\n", "Here $x$ is the indeterminate, while $a, b, c$ and the coefficients of the polynomial $t(x)$ are the decision variables. The code below shows how to model and solve this problem in Drake's MathematicalProgram." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "\"\"\"\n", "Code for example 2.\n", "\"\"\"\n", "\n", "# Initialize an empty optimization program.\n", "prog = MathematicalProgram()\n", "# Declare the indeterminates.\n", "x = prog.NewIndeterminates(1)\n", "\n", "# Declare the decision variables a, b, and c.\n", "a = prog.NewContinuousVariables(1)[0]\n", "b = prog.NewContinuousVariables(1)[0]\n", "c = prog.NewContinuousVariables(1)[0]\n", "\n", "# Declare p(x); the second argument indicates that this polynomial p(x) has x\n", "# as the indeterminate.\n", "poly_p = sym.Polynomial(x[0]**4 + a * x[0]**3 + b * x[0]**2 + c * x[0] + 1, sym.Variables(x))\n", "# Declare the SOS polynomial t(x), such that s(x) = p(x) - x(1-x) * t(x) is SOS.\n", "# The second return argument of NewSosPolynomial is the positive-semidefinite\n", "# Gramian matrix of poly_t. We ignore this matrix here.\n", "# t(x) should be a quadratic polynomial of x, hence the second input argument is\n", "# 2, indicating the degree of the returned polynomial.\n", "poly_t, _ = prog.NewSosPolynomial(sym.Variables(x), 2)\n", "# Compute s(x) = p(x) - x(1-x)t(x) as a polynomial of indeterminate x.\n", "poly_s = poly_p - sym.Polynomial(x[0] * (1 - x[0]), sym.Variables(x)) * poly_t\n", "prog.AddSosConstraint(poly_s)\n", "\n", "# Add the cost a + b + c.\n", "prog.AddCost(a + b + c)\n", "\n", "result = Solve(prog)\n", "print(\"Is success?\", result.is_success())\n", "print(result.get_solution_result())\n", "a_val = result.GetSolution(a)\n", "b_val = result.GetSolution(b)\n", "c_val = result.GetSolution(c)\n", "\n", "# Now plot the polynomial p(x), which should be non-negative on the range [0, 1].\n", "x_val = np.linspace(0, 1, 100)\n", "y_val = np.power(x_val, 4) + a_val * np.power(x_val, 3) + b_val * np.power(x_val, 2) + c_val * x_val + 1\n", "plt.xlabel(\"x\")\n", "plt.ylabel(\"p(x)\")\n", "plt.plot(x_val, y_val)" ] },
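{ "cell_type": "markdown", "metadata": {}, "source": [ "As a quick follow-up (assuming the cell above has been run), we can print the solved coefficients and confirm from the sampled values used for the plot that the optimized polynomial stays non-negative on $[0, 1]$." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Print the solved coefficients and the minimum of the sampled values of p(x)\n", "# on [0, 1]; the minimum should be (numerically) non-negative.\n", "print(\"a =\", a_val, \", b =\", b_val, \", c =\", c_val)\n", "print(\"a + b + c =\", a_val + b_val + c_val)\n", "print(\"min of p(x) on the sampled grid =\", y_val.min())" ] },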
{ "cell_type": "markdown", "metadata": {}, "source": [ "## Further readings\n", "\n", "Sum-of-squares optimization has been widely used in numerous fields, including control theory, robotics, quantum information theory, etc. Here are some references.\n", "\n", "#### Sum-of-squares theory\n", "* [Semidefinite Optimization and Convex Algebraic Geometry](http://www.mit.edu/~parrilo/sdocag/) by G. Blekherman, P. Parrilo and R. Thomas\n", "* [Sum of squares techniques and polynomial optimization](http://www.princeton.edu/~amirali/Public/Presentations/CDC_2016_Parrilo_1) Presentation slides by Pablo Parrilo\n", "\n", "#### System control theory\n", "* [Structured Semidefinite Programs and Semialgebraic Geometry Methods in Robustness and Optimization](http://www.mit.edu/~parrilo/pubs/files/thesis.pdf) Pablo Parrilo's PhD thesis\n", "* [Robust Region-of-Attraction Estimation](https://ieeexplore.ieee.org/abstract/document/5337881) Ufuk Topcu, Andrew Packard, Peter Seiler, Gary Balas, IEEE Transactions on Automatic Control, 2009\n", "\n", "#### Robotics\n", "* [LQR-trees: Feedback motion planning via sums-of-squares verification](http://groups.csail.mit.edu/robotics-center/public_papers/Tedrake10.pdf) Russ Tedrake, Ian Manchester, Mark Tobenkin, John Roberts, International Journal of Robotics Research, 2010\n", "* [Funnel libraries for real-time robust feedback motion planning](http://groups.csail.mit.edu/robotics-center/public_papers/Majumdar16.pdf) Anirudha Majumdar and Russ Tedrake, International Journal of Robotics Research, 2017" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.9" } }, "nbformat": 4, "nbformat_minor": 1 }