
When I apply approx_fprime to scipy.minimize, it doesn't iterate

I tried using the minimize function from SciPy as in the code below. When I pass jac=approx_fprime, the iteration count is 0 and the optimization doesn't work. But when I pass jac=rosen_der, it works!

Thank you for reading

```python
import numpy as np
from scipy.optimize import minimize, approx_fprime


def rosen(x):
    """The Rosenbrock function"""
    return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)

def rosen_der(x):
    # derivative of the Rosenbrock function
    xm = x[1:-1]
    xm_m1 = x[:-2]
    xm_p1 = x[2:]
    der = np.zeros_like(x)
    der[1:-1] = 200*(xm-xm_m1**2) - 400*(xm_p1 - xm**2)*xm - 2*(1-xm)
    der[0] = -400*x[0]*(x[1]-x[0]**2) - 2*(1-x[0])
    der[-1] = 200*(x[-1]-x[-2]**2)
    return der


x0 = np.array([1.3, 0.7])
eps = np.sqrt(np.finfo(float).eps)
fprime = lambda x: np.array(approx_fprime(x0, rosen, eps))
res = minimize(rosen, x0, method='CG', jac=fprime, options={'maxiter': 10, 'disp': True})
print(res.x)
```

Output:

```
[ 515.40001106 -197.99999905]
[ 515.4 -198. ]
98.10000000000005
Warning: Desired error not necessarily achieved due to precision loss.
         Current function value: 98.100000
         Iterations: 0
         Function evaluations: 33
         Gradient evaluations: 21
[1.3 0.7]
```

I checked that approx_fprime returns an ndarray, just like rosen_der, and the values are the same too. Why doesn't the optimization work?


Your function fprime is a function of x but approximates the derivative at x0. Consequently, you're evaluating the gradient at the initial guess x0 in every iteration. You should evaluate/approximate the derivative at x instead:

```python
fprime = lambda x: approx_fprime(x, rosen, eps)
```

Note that approx_fprime already returns an np.ndarray, so there's no need for the extra np.array call.
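Putting the fix together, a minimal runnable sketch (reusing the rosen definition and starting point from the question) might look like:

```python
import numpy as np
from scipy.optimize import minimize, approx_fprime


def rosen(x):
    """The Rosenbrock function"""
    return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)


x0 = np.array([1.3, 0.7])
eps = np.sqrt(np.finfo(float).eps)

# Approximate the gradient at the current iterate x, not at the fixed x0
fprime = lambda x: approx_fprime(x, rosen, eps)

res = minimize(rosen, x0, method='CG', jac=fprime, options={'maxiter': 10})
print(res.nit, res.x)  # now several iterations, moving toward the minimum at [1, 1]
```

With the gradient evaluated at the moving iterate, res.nit is no longer 0 and the function value drops well below the starting value of 98.1.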

It's also worth mentioning that you don't need to pass approximated derivatives at all: by default, i.e. with jac=None, minimize approximates them by finite differences itself. However, minimize uses approx_derivative under the hood instead of approx_fprime, as it provides support for evaluating derivatives at variable bounds.
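For comparison, a minimal sketch relying on that default finite-difference approximation (no jac argument at all):

```python
import numpy as np
from scipy.optimize import minimize


def rosen(x):
    """The Rosenbrock function"""
    return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)


x0 = np.array([1.3, 0.7])

# With jac=None (the default), minimize approximates the gradient internally
res = minimize(rosen, x0, method='CG', options={'maxiter': 10})
print(res.nit, res.x)
```

This behaves essentially the same as passing a correct approx_fprime wrapper, at the cost of extra function evaluations per iteration.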

