{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# `NLopt` with autodiff and numerical gradients\n", "Presented by Chiyoung Ahn (https://github.com/chiyahn)\n", "\n", "This notebook demonstrates how nonlinear optimization problems in `NLopt` can be solved without the need of specifying analytic formulae for gradients by autodifferentiation from `ForwardDiff` and `Flux` or numerical gradients." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "] add NLopt BenchmarkTools ForwardDiff NLSolversBase DiffResults Flux" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "scrolled": true }, "outputs": [], "source": [ "using NLopt, BenchmarkTools, ForwardDiff, NLSolversBase, DiffResults, Flux\n", "using Flux.Tracker: gradient_" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Using vanilla `NLopt`" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Nonlinear optimization without nonlinear constraints:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "First, define the objective function `f` and the corresponding gradient `g!`. In `NLopt`, evaluation of `f` and execution of `g!` take place in the same time:" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "fg! (generic function with 1 method)" ] }, "execution_count": 2, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# call g! then return f(x)\n", "function fg!(x::Vector, grad::Vector)\n", " if length(grad) > 0 # gradient of f(x)\n", " grad[1] = -2*x[1]*(x[1]^2 + x[2]^2)\n", " grad[2] = -2*x[2]*(x[1]^2 + x[2]^2)\n", " end\n", " return -(x[1]^2 + x[2]^2)\n", "end" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "and the corresponding optimization problem:" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "opt = Opt(:LD_LBFGS, 2) # 2 indicates the length of `x`\n", "lower_bounds!(opt, [-1.0, -1.0]) # find `x` above -2.0\n", "upper_bounds!(opt, [2.0, 2.0]) # find `x` below 2.0\n", "min_objective!(opt, fg!) # specifies that optimization problem is on minimization" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "and solve it!" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 3.429 ms (21 allocations: 976 bytes)\n", "got -8.0 at [2.0, 2.0] after 2 iterations (returned SUCCESS)\n" ] } ], "source": [ "(minf,minx,ret) = @btime optimize($opt, [1.0, 1.0])\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Nonlinear optimization with nonlinear constraints:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Define `fg!` first:" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "fg! (generic function with 1 method)" ] }, "execution_count": 5, "metadata": {}, "output_type": "execute_result" } ], "source": [ "function fg!(x::Vector, grad::Vector)\n", " if length(grad) > 0 # gradient of f(x)\n", " grad[1] = 0\n", " grad[2] = 0.5/sqrt(x[2])\n", " end\n", " return sqrt(x[2]) # f(x)\n", "end" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "and the corresponding optimization problem:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ "opt = Opt(:LD_SLSQP, 2) # 2 indicates the length of `x`\n", "lower_bounds!(opt, [-Inf, 0.]) # forces `x` to have a non-negative value\n", "min_objective!(opt, fg!) # specifies that optimization problem is on minimization\n", "xtol_rel!(opt,1e-4) # set a lower relative xtol for convergence criteria" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Similarly for constraint, where `constraint_f(x) <= 0` is imposed for all `x` " ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "constraint_fg! (generic function with 1 method)" ] }, "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ "function constraint_f(x::Vector, a, b)\n", " (a*x[1] + b)^3 - x[2] # constraint_f(x); constraint_f(x) <= 0 is imposed\n", "end\n", "\n", "function constraint_g!(x::Vector, grad::Vector, a, b)\n", " grad[1] = 3a * (a*x[1] + b)^2\n", " grad[2] = -1\n", "end\n", "\n", "function constraint_fg!(x::Vector, grad::Vector, a, b)\n", " if length(grad) > 0 # gradient of constraint_f(x)\n", " constraint_g!(x, grad, a, b)\n", " end\n", " return constraint_f(x, a, b)\n", "end" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here, `a` and `b` are added to allow variants of `constraint_f(x)` in a handy way. For instance, to impose\n", "```julia\n", "(2*x[1] + 0)^3 - x[2] <= 0\n", "```\n", "AND\n", "```julia\n", "(-1*x[1] + 1)^3 - x[2] <= 0\n", "```\n", "one can simply run the following two lines:" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [], "source": [ "inequality_constraint!(opt, (x,g) -> constraint_fg!(x,g,2,0), 1e-8)\n", "inequality_constraint!(opt, (x,g) -> constraint_fg!(x,g,-1,1), 1e-8)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Ready to roll out:" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 14.756 μs (242 allocations: 9.03 KiB)\n", "got 0.5443310539518157 at [0.333333, 0.296296] after 13 iterations (returned XTOL_REACHED)\n" ] } ], "source": [ "(minf,minx,ret) = @btime optimize($opt, [1.234, 5.678])\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### With vectorized constraints" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "It might be desirable to have a vector-valued constraint function to have all constraints at once instead of manually adding each constraint function. This can be done by calling `inequality_constraint` with a vectorized `tol` parameter that has the same length as a constraint function. For instance, the above two constraints\n", "\n", "```julia\n", "inequality_constraint!(opt, (x,g) -> constraint_fg!(x,g,2,0), 1e-8)\n", "inequality_constraint!(opt, (x,g) -> constraint_fg!(x,g,-1,1), 1e-8)\n", "```\n", "\n", "can be instead added by:" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [], "source": [ "function fg!(x::Vector, grad::Vector)\n", " if length(grad) > 0 # gradient of f(x)\n", " grad[1] = 0\n", " grad[2] = 0.5/sqrt(x[2])\n", " end\n", " return sqrt(x[2]) # f(x)\n", "end\n", "\n", "opt = Opt(:LD_SLSQP, 2) # 2 indicates the length of `x`\n", "lower_bounds!(opt, [-Inf, 0.]) # forces `x` to have a non-negative value\n", "min_objective!(opt, fg!) # specifies that optimization problem is on minimization\n", "xtol_rel!(opt,1e-4) # set a lower relative xtol for convergence criteria\n", "\n", "# define a vectorized constraint\n", "function constraints_fg!(result, x, jacobian_t, a, b)\n", " if length(jacobian_t) > 0 # transpose of the Jacobian matrix\n", " jacobian_t[1,1] = 3a[1] * (a[1]*x[1] + b[1])^2\n", " jacobian_t[2,1] = -1\n", " jacobian_t[1,2] = 3a[2] * (a[2]*x[1] + b[2])^2\n", " jacobian_t[2,2] = -1\n", " end\n", " result[:] = [constraint_f(x,a[1],b[1]); \n", " constraint_f(x,a[2],b[2])]\n", "end\n", "\n", "# add a vectorized constraint\n", "inequality_constraint!(opt, (result, x, jacobian_t) -> constraints_fg!(result, x, jacobian_t, [2; -1], [0; 1]), \n", " [1e-8; 1e-8])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Note that `gradient` field is now replaced by the **transpose** of a Jacobian matrix as the constraint function is now vector-valued.\n", "\n", "Running optimization routine yields the identical solution as above:" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "scrolled": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 13.473 μs (203 allocations: 10.86 KiB)\n", "got 0.5443310539518157 at [0.333333, 0.296296] after 13 iterations (returned XTOL_REACHED)\n" ] } ], "source": [ "(minf,minx,ret) = @btime optimize($opt, [1.234, 5.678])\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Using `NLopt` without analytic formulae for gradients\n", "\n", "Suppose that `f` you want to optimize is written in a fairly complex form and you do not have an access to the analytic formula of the gradient of `f`. Here is a solution using automatic differentiation:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Automatic differentiation, with `ForwardDiff`" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "scrolled": true }, "outputs": [ { "data": { "text/plain": [ "fg! (generic function with 1 method)" ] }, "execution_count": 12, "metadata": {}, "output_type": "execute_result" } ], "source": [ "function f(x)\n", " return -(x[1]^2 + x[2])^2 \n", "end\n", "\n", "# compute gradient by forward automatic differentiation\n", "function g!(G::Vector, x::Vector)\n", " ForwardDiff.gradient!(G, f, x)\n", "end\n", "\n", "function fg!(x::Vector, grad::Vector)\n", " if length(grad) > 0 # gradient of f(x)\n", " g!(grad, x)\n", " end\n", " f(x)\n", "end" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Solve:" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 3.536 ms (32 allocations: 1.67 KiB)\n", "got -36.0 at [2.0, 2.0] after 3 iterations (returned SUCCESS)\n" ] } ], "source": [ "# define the optimization problem\n", "opt = Opt(:LD_LBFGS, 2) # 2 indicates the length of `x`\n", "lower_bounds!(opt, [-1.0, -1.0]) # find `x` above -2.0\n", "upper_bounds!(opt, [2.0, 2.0]) # find `x` below 2.0\n", "min_objective!(opt, fg!) # specifies that optimization problem is on minimization\n", "\n", "# solve the optimization problem\n", "(minf,minx,ret) = @btime optimize($opt, [1.0, 1.0])\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Since `g!` requires evaluation of `f` as well, it might be redundant to compute both values separately. In case evaluating `f` requires a huge amount of computation, one can alternatively use `DiffResults` to call the saved values:" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "fg! (generic function with 1 method)" ] }, "execution_count": 14, "metadata": {}, "output_type": "execute_result" } ], "source": [ "function fg!(x::Vector, grad::Vector)\n", " result = DiffResults.GradientResult(x) # gen a result object\n", " result = ForwardDiff.gradient!(result, f, x) # update\n", " grad[:] = DiffResults.gradient(result) # run g!(x)\n", " return DiffResults.value(result) # return f(x)\n", "end" ] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 3.456 ms (41 allocations: 2.09 KiB)\n", "got -36.0 at [2.0, 2.0] after 3 iterations (returned SUCCESS)\n" ] } ], "source": [ "# define the optimization problem\n", "opt = Opt(:LD_LBFGS, 2) # 2 indicates the length of `x`\n", "lower_bounds!(opt, [-1.0, -1.0]) # find `x` above -2.0\n", "upper_bounds!(opt, [2.0, 2.0]) # find `x` below 2.0\n", "min_objective!(opt, fg!) # specifies that optimization problem is on minimization\n", "\n", "# solve the optimization problem\n", "(minf,minx,ret) = @btime optimize($opt, [1.0, 1.0])\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Automatic differentiation, with `Flux`" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 3.481 ms (146 allocations: 5.28 KiB)\n", "got -36.0 after 3 iterations (returned SUCCESS)\n" ] } ], "source": [ "function fg!(x::Vector, grad::Vector)\n", " val = f(x)\n", " grad[:] = gradient_(f, x)[1]\n", " return val\n", "end\n", "# define the optimization problem\n", "opt = Opt(:LD_LBFGS, 2) # 2 indicates the length of `x`\n", "lower_bounds!(opt, [-1.0, -1.0]) # find `x` above -2.0\n", "upper_bounds!(opt, [2.0, 2.0]) # find `x` below 2.0\n", "min_objective!(opt, fg!) # specifies that optimization problem is on minimization\n", "\n", "# solve the optimization problem\n", "(minf,minx,ret) = @btime optimize($opt, [1.0, 1.0])\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Integration with `NLSolversBase` interface" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Add the following definition for `NLoptAdapter` to exploit the native support for numerical derivatives and autodifferentiation from `NLSolversBase`:" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "NLoptAdapter" ] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" } ], "source": [ "struct NLoptAdapter{T} <: Function where T <: AbstractObjective\n", " nlsolver_base::T\n", "end\n", "\n", "# implement fg!; note that the order is reversed\n", "(adapter::NLoptAdapter)(x, df) = adapter.nlsolver_base.fdf(df, x)\n", "(adapter::NLoptAdapter)(result, x, jacobian_transpose) = adapter.nlsolver_base.fdf(result, jacobian_transpose', x)\n", "\n", "# constructors\n", "NLoptAdapter(f, x, autodiff = :forward) = NLoptAdapter(OnceDifferentiable(f, x, autodiff = autodiff))\n", "NLoptAdapter(f!, x::Vector, F::Vector, autodiff = :forward) = NLoptAdapter(OnceDifferentiable(f!, x, F, autodiff = autodiff))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's roll out:" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 3.507 ms (146 allocations: 5.28 KiB)\n", "got -36.0 at [2.0, 2.0] after 3 iterations (returned SUCCESS)\n" ] } ], "source": [ "f_opt = NLoptAdapter(x -> -(x[1]^2 + x[2])^2, zeros(2), :forward)\n", "\n", "# define the optimization problem\n", "opt = Opt(:LD_LBFGS, 2) # 2 indicates the length of `x`\n", "lower_bounds!(opt, [-1.0, -1.0]) # find `x` above -2.0\n", "upper_bounds!(opt, [2.0, 2.0]) # find `x` below 2.0\n", "min_objective!(opt, fg!) # specifies that optimization problem is on minimization\n", "\n", "# solve the optimization problem\n", "(minf,minx,ret) = @btime optimize($opt, [1.0, 1.0])\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Compare this with the performance from automatic differentiation example above." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### With nonlinear constraints" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 100.090 μs (271 allocations: 9.17 KiB)\n", "got 0.5443310477213124 at [0.333333, 0.296296] after 11 iterations (returned XTOL_REACHED)\n" ] } ], "source": [ "myfunc(x) = sqrt(x[2])\n", "x0 = [1.234, 5.678]\n", "function myconstraint(x, a, b) \n", " (a*x[1] + b)^3 - x[2]\n", "end\n", "\n", "# define objective and constraint, using NLoptAdapter\n", "f_opt = NLoptAdapter(myfunc, x0)\n", "c_1_opt = NLoptAdapter(x -> myconstraint(x,2,0), x0)\n", "c_2_opt = NLoptAdapter(x -> myconstraint(x,-1,1), x0)\n", "\n", "# define the optimization problem\n", "opt = Opt(:LD_MMA, 2)\n", "lower_bounds!(opt, [-Inf, 0.])\n", "xtol_rel!(opt,1e-4)\n", "\n", "min_objective!(opt, f_opt)\n", "inequality_constraint!(opt, c_1_opt, 1e-8)\n", "inequality_constraint!(opt, c_2_opt, 1e-8)\n", "\n", "# solve\n", "(minf,minx,ret) = @btime optimize($opt, $x0)\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Works for central-difference methods too:" ] }, { "cell_type": "code", "execution_count": 20, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 99.769 μs (205 allocations: 7.63 KiB)\n", "got 0.5443310476210945 at [0.333333, 0.296296] after 11 iterations (returned XTOL_REACHED)\n" ] } ], "source": [ "# define objective and constraint, using NLoptAdapter\n", "f_opt = NLoptAdapter(myfunc, x0, :central)\n", "c_1_opt = NLoptAdapter(x -> myconstraint(x,2,0), x0, :central)\n", "c_2_opt = NLoptAdapter(x -> myconstraint(x,-1,1), x0, :central)\n", "\n", "# define the optimization problem\n", "opt = Opt(:LD_MMA, 2)\n", "lower_bounds!(opt, [-Inf, 0.])\n", "xtol_rel!(opt,1e-4)\n", "\n", "min_objective!(opt, f_opt)\n", "inequality_constraint!(opt, c_1_opt, 1e-8)\n", "inequality_constraint!(opt, c_2_opt, 1e-8)\n", "\n", "# solve\n", "(minf,minx,ret) = @btime optimize($opt, $x0)\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### With vectorized constraints" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "When vectorized constraints are passed, `myconstraints!` should be used for assignment of evaluated constraints, rather than return them." ] }, { "cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "(::NLoptAdapter{OnceDifferentiable{Array{Float64,1},Array{Float64,2},Array{Float64,1}}}) (generic function with 2 methods)" ] }, "execution_count": 21, "metadata": {}, "output_type": "execute_result" } ], "source": [ "function myconstraints!(F, x) \n", " F[:] = [myconstraint(x,2,0); myconstraint(x,-1,1)]\n", "end\n", "\n", "# define objective and constraint, using NLoptAdapter\n", "f_opt = NLoptAdapter(myfunc, x0, :central)\n", "c_opt = NLoptAdapter(myconstraints!, x0, zeros(2), :central) # 2 is the length of myconstraints" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The rest of the procedure is similar:" ] }, { "cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 98.806 μs (205 allocations: 11.41 KiB)\n", "got 0.5443310476210945 at [0.333333, 0.296296] after 11 iterations (returned XTOL_REACHED)\n" ] } ], "source": [ "# define the optimization problem\n", "opt = Opt(:LD_MMA, 2)\n", "lower_bounds!(opt, [-Inf, 0.])\n", "xtol_rel!(opt,1e-4)\n", "\n", "min_objective!(opt, f_opt)\n", "inequality_constraint!(opt, c_opt, fill(1e-8, 2))\n", "\n", "# solve\n", "(minf,minx,ret) = @btime optimize($opt, $x0)\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### With derivative-free methods" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The wrapper works for derivative-free methods too. First, consider the following vanilla `nlopt` implementation:" ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "got 0.0 at [0.0, 0.0] after 11 iterations (returned XTOL_REACHED)\n" ] } ], "source": [ "function myfunc(x::Vector, grad::Vector)\n", " return sqrt(x[1]^2 + x[2]^2)\n", "end\n", "\n", "opt = Opt(:LN_NELDERMEAD, 2)\n", "lower_bounds!(opt, [0.0, 0.0])\n", "xtol_rel!(opt,1e-4)\n", "\n", "min_objective!(opt, myfunc)\n", "\n", "(minf,minx,ret) = optimize(opt, [1.234, 5.678])\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here's one using `NLoptAdapter`:" ] }, { "cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "got 0.0 at [0.0, 0.0] after 11 iterations (returned XTOL_REACHED)\n" ] } ], "source": [ "f_opt = NLoptAdapter(x -> sqrt(x[1]^2 + x[2]^2), x0, :central)\n", "\n", "opt = Opt(:LN_NELDERMEAD, 2)\n", "lower_bounds!(opt, [0.0, 0.0])\n", "xtol_rel!(opt,1e-4)\n", "\n", "min_objective!(opt, f_opt)\n", "\n", "(minf,minx,ret) = optimize(opt, [1.234, 5.678])\n", "numevals = opt.numevals # the number of function evaluations\n", "println(\"got $minf at $minx after $numevals iterations (returned $ret)\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "which returns the identical result." ] } ], "metadata": { "kernelspec": { "display_name": "Julia 1.0.0", "language": "julia", "name": "julia-1.0" }, "language_info": { "file_extension": ".jl", "mimetype": "application/julia", "name": "julia", "version": "1.0.0" } }, "nbformat": 4, "nbformat_minor": 2 }