scipy.optimize.brent

scipy.optimize.brent(func, args=(), brack=None, tol=1.48e-08, full_output=0, maxiter=500)

Given a function of one variable and a possible bracket, return the local minimum of the function isolated to a fractional precision of tol.

Parameters:

func : callable f(x,*args)

Objective function.

args : tuple, optional

Additional arguments (if present), passed to the objective function.

brack : tuple, optional

Either a triple (xa, xb, xc) where xa < xb < xc and func(xb) < func(xa), func(xc), or a pair (xa, xb) used as a starting interval for a downhill bracket search (see bracket). Providing the pair (xa, xb) does not guarantee that the obtained solution will satisfy xa <= x <= xb.

tol : float, optional

Stop if the change between iterations is less than tol.

full_output : bool, optional

If True, return all output args (xmin, fval, iter, funcalls), as shown in the example after the Returns section.

maxiter : int, optional

Maximum number of iterations allowed.

Returns:

xmin : ndarray

Optimum point.

fval : float

Optimum value.

iter : int

Number of iterations.

funcalls : int

Number of objective function evaluations made.
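As a minimal sketch (the shifted quadratic g below is a hypothetical objective), extra parameters can be passed via args, and full_output=True returns all four values described above:

>>> from scipy import optimize
>>> def g(x, a):
...     return (x - a)**2
>>> xmin, fval, niter, funcalls = optimize.brent(
...     g, args=(3,), brack=(0, 2, 5), full_output=True)
>>> abs(xmin - 3) < 1e-6 and fval == g(xmin, 3)
True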

See also

minimize_scalar
Interface to minimization algorithms for scalar univariate functions. See the ‘Brent’ method in particular.

Notes

Uses inverse parabolic interpolation when possible to speed up convergence of the golden section method.
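As a rough illustration of that parabolic step (a standalone sketch, not the routine's internal code), the next trial point is the vertex of the parabola through three bracketing points; for a purely quadratic objective it lands on the minimum in a single step:

>>> def parabolic_step(xa, xb, xc, fa, fb, fc):
...     # Vertex of the parabola through (xa, fa), (xb, fb), (xc, fc).
...     num = (xb - xa)**2 * (fb - fc) - (xb - xc)**2 * (fb - fa)
...     den = (xb - xa) * (fb - fc) - (xb - xc) * (fb - fa)
...     return xb - 0.5 * num / den
>>> parabolic_step(-1, 0.5, 2, 1, 0.25, 4)  # bracketing points of f(x) = x**2
0.0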

Does not ensure that the minimum lies in the range specified by brack. See fminbound.
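When the solution must stay inside an interval, a bound-constrained search via fminbound is the usual alternative; a minimal sketch (quadratic objective assumed, whose unconstrained minimum at 0 lies outside the bounds):

>>> from scipy import optimize
>>> xmin = optimize.fminbound(lambda x: x**2, 1, 3)
>>> abs(xmin - 1) < 1e-4  # the constrained minimum sits at the lower bound
True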

Examples

We illustrate the behaviour of the function when brack is of size 2 and 3, respectively. In the case where brack is of the form (xa, xb), we can see that, for the given values, the output need not lie in the range (xa, xb).

>>> def f(x):
...     return x**2
>>> from scipy import optimize
>>> minimum = optimize.brent(f, brack=(1, 2))
>>> minimum
0.0
>>> minimum = optimize.brent(f, brack=(-1, 0.5, 2))
>>> minimum
-2.7755575615628914e-17
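
To make the same point more starkly (the shifted quadratic below is again an illustrative assumption), a pair bracket only seeds the downhill search, so the minimizer can land well outside (xa, xb):

>>> def h(x):
...     return (x - 10)**2
>>> minimum = optimize.brent(h, brack=(0, 1))
>>> abs(minimum - 10) < 1e-6
True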