

scipy.optimize.line_search

scipy.optimize.line_search(f, myfprime, xk, pk, gfk=None, old_fval=None, old_old_fval=None, args=(), c1=0.0001, c2=0.9, amax=None, extra_condition=None, maxiter=10)

Find alpha that satisfies strong Wolfe conditions.

Parameters:

f : callable f(x,*args)

Objective function.

myfprime : callable f'(x,*args)

Objective function gradient.

xk : ndarray

Starting point.

pk : ndarray

Search direction. The search direction must be a descent direction for the algorithm to converge.

gfk : ndarray, optional

Gradient value for x=xk (xk being the current parameter estimate). Will be recomputed if omitted.

old_fval : float, optional

Function value for x=xk. Will be recomputed if omitted.

old_old_fval : float, optional

Function value for the point preceding x=xk.

args : tuple, optional

Additional arguments passed to the objective function.

c1 : float, optional

Parameter for the Armijo condition rule.

c2 : float, optional

Parameter for the curvature condition rule.

amax : float, optional

Maximum step size.

extra_condition : callable, optional

A callable of the form extra_condition(alpha, x, f, g) returning a boolean. Arguments are the proposed step alpha and the corresponding x, f and g values. The line search accepts the value of alpha only if this callable returns True. If the callable returns False for the step length, the algorithm will continue with new iterates. The callable is only called for iterates satisfying the strong Wolfe conditions.

maxiter : int, optional

Maximum number of iterations to perform.

Returns:

alpha : float or None

Alpha for which x_new = x0 + alpha * pk, or None if the line search algorithm did not converge.

fc : int

Number of function evaluations made.

gc : int

Number of gradient evaluations made.

new_fval : float or None

New function value f(x_new)=f(x0+alpha*pk), or None if the line search algorithm did not converge.

old_fval : float

Old function value f(x0).

new_slope : float or None

The local slope along the search direction at the new value <myfprime(x_new), pk>, or None if the line search algorithm did not converge.

Notes

Uses the line search algorithm to enforce strong Wolfe conditions. See Wright and Nocedal, ‘Numerical Optimization’, 1999, pp. 59-61.
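Concretely, the strong Wolfe conditions referenced above consist of a sufficient-decrease (Armijo) condition governed by c1 and a strong curvature condition governed by c2, with 0 < c1 < c2 < 1 (as in Nocedal and Wright's treatment):

```latex
% Sufficient decrease (Armijo) condition, parameter c_1:
f(x_k + \alpha p_k) \le f(x_k) + c_1 \, \alpha \, \nabla f(x_k)^{T} p_k
% Strong curvature condition, parameter c_2:
\left| \nabla f(x_k + \alpha p_k)^{T} p_k \right| \le c_2 \left| \nabla f(x_k)^{T} p_k \right|
```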

The search direction pk must be a descent direction (e.g. -myfprime(xk)) to find a step length that satisfies the strong Wolfe conditions. If the search direction is not a descent direction (e.g. myfprime(xk)), then alpha, new_fval, and new_slope will be None.
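The failure mode described above can be demonstrated directly. The sketch below (obj_func and obj_grad are illustrative names, not part of the API) passes the gradient itself as pk, i.e. an ascent direction, so the search cannot satisfy the strong Wolfe conditions and returns None for alpha, new_fval, and new_slope:

```python
import warnings

import numpy as np
from scipy.optimize import line_search

def obj_func(x):
    # Simple convex quadratic: f(x) = x0**2 + x1**2
    return x[0]**2 + x[1]**2

def obj_grad(x):
    # Analytic gradient of the quadratic above
    return np.array([2.0 * x[0], 2.0 * x[1]])

xk = np.array([1.8, 1.7])
pk = obj_grad(xk)  # ascent direction: NOT a descent direction

with warnings.catch_warnings():
    # The failed search emits a LineSearchWarning; suppress it here
    warnings.simplefilter("ignore")
    alpha, fc, gc, new_fval, old_fval, new_slope = line_search(
        obj_func, obj_grad, xk, pk)

# alpha, new_fval, and new_slope are all None: the search did not converge
```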

Examples

>>> import numpy as np
>>> from scipy.optimize import line_search

An objective function and its gradient are defined.

>>> def obj_func(x):
...     return (x[0])**2 + (x[1])**2
>>> def obj_grad(x):
...     return [2*x[0], 2*x[1]]

We can find alpha that satisfies strong Wolfe conditions.

>>> start_point = np.array([1.8, 1.7])
>>> search_gradient = np.array([-1.0, -1.0])
>>> line_search(obj_func, obj_grad, start_point, search_gradient)
(1.0, 2, 1, 1.1300000000000001, 6.13, [1.6, 1.4])
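The returned alpha is then used to take the step x_new = xk + alpha * pk. The sketch below reuses the objective above and also shows extra_condition as an additional veto on candidate steps (the accept callable is purely illustrative):

```python
import numpy as np
from scipy.optimize import line_search

def obj_func(x):
    # f(x) = x0**2 + x1**2
    return x[0]**2 + x[1]**2

def obj_grad(x):
    return np.array([2.0 * x[0], 2.0 * x[1]])

xk = np.array([1.8, 1.7])
pk = np.array([-1.0, -1.0])  # a descent direction for this objective

# extra_condition may reject Wolfe-satisfying steps; here any alpha <= 2.0
# is accepted (an illustrative, permissive condition)
accept = lambda alpha, x, f, g: alpha <= 2.0

alpha, fc, gc, new_fval, old_fval, new_slope = line_search(
    obj_func, obj_grad, xk, pk, extra_condition=accept)

# Take the accepted step; the function value decreases along pk
x_new = xk + alpha * pk
```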


© Copyright 2008-2024, The SciPy community.
