Getting lambda in a Box-Cox equation

I have a set of values and I need to find the lambda for a Box-Cox transformation. The data follow a normal curve (Gaussian distribution). Does anyone know how to get the optimal value of lambda in R, C#, MATLAB, Python, or Perl?


In R: package geoR, boxcox.fit


boxcox in the MASS package


SciPy has a lot of good curve-fitting modules. There is a cookbook recipe for linear regression, and if you need something more complex, there is the scipy.optimize package.
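As a concrete sketch of the optimize route: scipy.stats exposes the Box-Cox profile log-likelihood as boxcox_llf, so you can hand its negative to scipy.optimize.minimize_scalar to find lambda. The lognormal sample below is a made-up illustration (its true lambda is 0, i.e. a log transform).

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import boxcox_llf

# Synthetic, strictly positive, skewed data (illustrative assumption).
rng = np.random.default_rng(42)
x = rng.lognormal(mean=0.0, sigma=1.0, size=2000)

# boxcox_llf(lmbda, data) is the log-likelihood of lambda given the data;
# minimizing its negative over a bracket gives the MLE of lambda.
res = minimize_scalar(lambda lmb: -boxcox_llf(lmb, x),
                      bounds=(-2.0, 2.0), method="bounded")
lam = res.x  # for lognormal input, this should land near 0
```

The same idea works for any data you can evaluate the likelihood on; the bounded bracket just keeps the search away from extreme lambdas.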


I'm surprised the following packages are not listed in any of the answers: in Python you can use boxcox from scipy.stats, and in R you can use BoxCoxTrans from the caret package. You can read more about resolving skewness for predictive modeling in this post.
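For the Python option mentioned above, scipy.stats.boxcox does the whole job in one call: with lmbda left unset it returns both the transformed data and the maximum-likelihood lambda. The skewed sample here is a synthetic placeholder.

```python
import numpy as np
from scipy import stats

# Synthetic, strictly positive, skewed data (illustrative assumption);
# for lognormal input the ideal lambda is 0 (a plain log transform).
rng = np.random.default_rng(0)
x = rng.lognormal(size=2000)

# With lmbda=None (the default), boxcox fits lambda by maximum likelihood
# and returns (transformed_data, fitted_lambda).
xt, lam = stats.boxcox(x)
```

Note that boxcox requires strictly positive input; shift your data first if it contains zeros or negatives.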


If your data is Gaussian (as you state in your question), then the optimal value of lambda is 1, i.e. it doesn't need to be transformed.
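This claim is easy to check empirically. The snippet below fits lambda on an already-Gaussian, strictly positive sample (synthetic data, an assumption for illustration); the fitted value sits near 1, i.e. the identity transform.

```python
import numpy as np
from scipy import stats

# Already-Gaussian, strictly positive sample (loc >> scale keeps it
# safely above zero, as Box-Cox requires positive data).
rng = np.random.default_rng(1)
x = rng.normal(loc=10.0, scale=1.0, size=5000)

# Fitted lambda should be close to 1: the data already looks normal,
# so no power transformation is called for.
_, lam = stats.boxcox(x)
```

When the data is far from zero relative to its spread, the likelihood in lambda flattens out, so expect some scatter around 1 rather than an exact hit.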


Perl's PDL has a Gaussian fit routine. PDL is a lot like MATLAB, except with the power of programming in Perl.
