SciPy global minimum curve fit
I'm using scipy.optimize.curve_fit, but I suspect it is converging to a local minimum and not the global minimum.
I tried using simulated annealing in the following way:
def fit(params):
    return np.sum((ydata - specf(xdata, *params))**2)

p = scipy.optimize.anneal(fit, [1000, 1E-10])
where specf is the curve I am trying to fit. The results in p, though, are clearly worse than the minimum returned by curve_fit, even when the return value indicates that the global minimum was reached (see anneal).
How can I improve the results? Is there a global curve fitter in SciPy?
You're right, it only converges towards a local minimum (when it converges at all), since it uses the Levenberg-Marquardt algorithm. There is no global curve fitter in SciPy; you have to write your own using the existing global optimizers. But be aware that even then it may not converge to the value you want; no optimizer can guarantee that in general.
The only real way to improve your result is to supply a good initial guess for the parameters.
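As an illustration (not part of the original answer), newer SciPy releases do ship global optimizers; the sketch below wraps the question's sum-of-squares objective in scipy.optimize.differential_evolution. It assumes specf, xdata and ydata are defined as in the question, and the bounds are placeholders you would have to adapt to your problem:

import numpy as np
from scipy.optimize import differential_evolution, curve_fit

def sum_of_squares(params):
    # same objective as the question's fit(): residual sum of squares
    return np.sum((ydata - specf(xdata, *params))**2)

# hypothetical bounds; they must bracket the true parameters to be useful
bounds = [(1e2, 1e4), (1e-12, 1e-8)]
result = differential_evolution(sum_of_squares, bounds, seed=0)

# optionally polish with curve_fit, using the global result as the starting guess
popt, pcov = curve_fit(specf, xdata, ydata, p0=result.x)

The global step only narrows down the region of interest; feeding its result to curve_fit as p0 then gives you the usual local refinement and covariance estimate.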
You might want to try using leastsq() (curve_fit actually uses this, but you don't get the full output) or the ODR package instead of curve_fit.
The full output of leastsq() gives you a lot more information, such as the chi-squared value (if you want to use that as a quick and dirty goodness-of-fit test).
If you need to weight the fit, you can do it this way:
import numpy as np
from scipy.optimize import leastsq

fitfunc = lambda p, x: p[0] + p[1]*np.exp(-x)
errfunc = lambda p, x, y, yerr: (y - fitfunc(p, x))/yerr         # weight residuals by the y uncertainties
out = leastsq(errfunc, pinit, args=(x, y, yerr), full_output=1)  # pinit is your initial parameter guess
popt, cov, infodict, mesg, ier = out                             # full_output returns five items
chisq = np.sum(infodict['fvec']*infodict['fvec'])
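If you want to try the ODR route mentioned above, here is a minimal sketch (my own, not from the original answer) using scipy.odr with the same exponential model, assuming x, y and yerr arrays already exist:

import numpy as np
from scipy.odr import ODR, Model, RealData

def model(beta, x):
    # same model as fitfunc above, in ODR's (beta, x) calling convention
    return beta[0] + beta[1]*np.exp(-x)

data = RealData(x, y, sy=yerr)      # sy weights the fit by the y uncertainties
odr = ODR(data, Model(model), beta0=[1.0, 1.0])
output = odr.run()
print(output.beta, output.sd_beta)  # fitted parameters and their standard errors

Unlike leastsq, RealData also accepts sx, so ODR can account for uncertainties in x as well.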
This is a nontrivial problem. Have you considered using Evolutionary Strategies? I have had great success with ecspy (see http://code.google.com/p/ecspy/) and the community is small but very helpful.