scipy.optimize.fmin_ncg
scipy.optimize.fmin_ncg(f, x0, fprime, fhess_p=None, fhess=None, args=(), avextol=1e-05, epsilon=1.4901161193847656e-08, maxiter=None, full_output=0, disp=1, retall=0, callback=None, c1=0.0001, c2=0.9)
Unconstrained minimization of a function using the Newton-CG method.
- Parameters:
- f : callable f(x, *args)
Objective function to be minimized.
- x0 : ndarray
Initial guess.
- fprime : callable f'(x, *args)
Gradient of f.
- fhess_p : callable fhess_p(x, p, *args), optional
Function which computes the Hessian of f times an arbitrary vector, p.
- fhess : callable fhess(x, *args), optional
Function to compute the Hessian matrix of f.
- args : tuple, optional
Extra arguments passed to f, fprime, fhess_p, and fhess (the same set of extra arguments is supplied to all of these functions).
- epsilon : float or ndarray, optional
If fhess is approximated, use this value for the step size.
- callback : callable, optional
An optional user-supplied function, called after each iteration as callback(xk), where xk is the current parameter vector (see the usage sketch after this parameter list).
- avextol : float, optional
Convergence is assumed when the average relative error in the minimizer falls below this amount.
- maxiter : int, optional
Maximum number of iterations to perform.
- full_output : bool, optional
If True, return the optional outputs.
- disp : bool, optional
If True, print convergence message.
- retall : bool, optional
If True, return a list of results at each iteration.
- c1 : float, default: 1e-4
Parameter for Armijo condition rule.
- c2 : float, default: 0.9
Parameter for curvature condition rule.
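As a usage sketch for args and callback (the scaled quadratic objective and the iterate-recording callback below are illustrative assumptions, not taken from this page):

>>> import numpy as np
>>> from scipy.optimize import fmin_ncg
>>> def f(x, scale):  # illustrative scaled quadratic (an assumption)
...     return scale * ((x[0] - 1.0)**2 + (x[1] + 2.0)**2)
>>> def fprime(x, scale):
...     return scale * np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)])
>>> iterates = []  # callback(xk) receives the current parameter vector
>>> xopt = fmin_ncg(f, np.zeros(2), fprime, args=(3.0,),
...                 callback=iterates.append, disp=False)
>>> # xopt is approximately [1., -2.]; iterates holds one vector per iteration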
- Returns:
- xopt : ndarray
Parameters which minimize f, i.e., f(xopt) == fopt.
- fopt : float
Value of the function at xopt, i.e., fopt = f(xopt).
- fcalls : int
Number of function calls made.
- gcalls : int
Number of gradient calls made.
- hcalls : int
Number of Hessian calls made.
- warnflag : int
Warnings generated by the algorithm.
1 : Maximum number of iterations exceeded.
2 : Line search failure (precision loss).
3 : NaN result encountered.
- allvecs : list
The result at each iteration, if retall (documented above) is True.
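A minimal usage sketch showing the extra outputs returned when full_output is True; the quadratic objective below is an illustrative assumption, not taken from this page:

>>> import numpy as np
>>> from scipy.optimize import fmin_ncg
>>> def f(x):  # illustrative quadratic objective (an assumption)
...     return (x[0] - 1.0)**2 + 10.0 * (x[1] + 2.0)**2
>>> def fprime(x):
...     return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
>>> xopt, fopt, fcalls, gcalls, hcalls, warnflag = fmin_ncg(
...     f, np.zeros(2), fprime, full_output=True, disp=False)
>>> # xopt is approximately [1., -2.]; warnflag is 0 when none of the
>>> # warnings listed above was raised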
See also
minimize
Interface to minimization algorithms for multivariate functions. See the ‘Newton-CG’ method in particular.
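As a hedged sketch of that recommended interface, the same kind of problem expressed through scipy.optimize.minimize with method='Newton-CG' (the quadratic objective is again an illustrative assumption):

>>> import numpy as np
>>> from scipy.optimize import minimize
>>> def f(x):  # illustrative quadratic objective (an assumption)
...     return (x[0] - 1.0)**2 + 10.0 * (x[1] + 2.0)**2
>>> def fprime(x):
...     return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
>>> res = minimize(f, np.zeros(2), method='Newton-CG', jac=fprime)
>>> # res.x is approximately [1., -2.]; res.success reports convergence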
Notes
Only one of fhess_p or fhess needs to be given. If fhess is provided, fhess_p is ignored. If neither fhess nor fhess_p is provided, the Hessian product is approximated using finite differences on fprime. When given, fhess_p must compute the product of the Hessian of f and an arbitrary vector.
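A minimal sketch of the two ways of supplying second-order information; the quadratic below is an illustrative assumption, not part of this page:

>>> import numpy as np
>>> from scipy.optimize import fmin_ncg
>>> def f(x):  # illustrative quadratic objective (an assumption)
...     return (x[0] - 1.0)**2 + 10.0 * (x[1] + 2.0)**2
>>> def fprime(x):
...     return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
>>> def fhess(x):  # full Hessian matrix of f
...     return np.diag([2.0, 20.0])
>>> def fhess_p(x, p):  # Hessian of f times an arbitrary vector p
...     return np.array([2.0 * p[0], 20.0 * p[1]])
>>> x_full = fmin_ncg(f, np.zeros(2), fprime, fhess=fhess, disp=False)
>>> x_hvp = fmin_ncg(f, np.zeros(2), fprime, fhess_p=fhess_p, disp=False)
>>> # both converge to approximately [1., -2.]

Supplying fhess_p rather than fhess is the usual choice when the Hessian is too large to form explicitly.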
Newton-CG methods are also called truncated Newton methods. This function differs from scipy.optimize.fmin_tnc because
- scipy.optimize.fmin_ncg is written purely in Python using NumPy and SciPy, while scipy.optimize.fmin_tnc calls a C function.
- scipy.optimize.fmin_ncg is only for unconstrained minimization, while scipy.optimize.fmin_tnc handles unconstrained or box-constrained minimization. (Box constraints give lower and upper bounds for each variable separately.)
Parameters c1 and c2 must satisfy 0 < c1 < c2 < 1.
References
Wright & Nocedal, ‘Numerical Optimization’, 1999, p. 140.