
How Long Does 1 Multiplication Operation Take Between Two Doubles?

I have a process I need to optimize, and I was wondering how long a multiplication operation between two doubles takes. If I can cut out 1000 of these, will it actually make a difference to the overall performance of my process?


This is highly system-specific. On my system, it takes only a few milliseconds to do 10 million multiplication operations. Removing 1000 of them is probably not going to be noticeable.
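For reference, here is a minimal timing sketch along those lines (in Java, chosen only because another answer mentions the JIT; the constants are illustrative, and a naive loop like this is a rough indicator rather than a rigorous benchmark, since there is no warm-up and the JIT may still optimize parts of it):

```java
public class MultiplyTiming {
    public static void main(String[] args) {
        final int n = 10_000_000;      // 10 million multiplications
        final double a = 1.0000001;    // close to 1.0 so the accumulator stays in range
        double acc = 1.0;

        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            acc *= a;                  // one double multiplication per iteration
        }
        long elapsedNanos = System.nanoTime() - start;

        // Print the accumulator so the loop cannot be eliminated as dead code.
        System.out.printf("result=%f, elapsed=%.2f ms%n", acc, elapsedNanos / 1e6);
    }
}
```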

If you really want to optimize your routine, this isn't the best approach. The better approach is to profile it, and find the bottleneck in your current implementation (which will likely not be what you expect). Then look at that bottleneck, and try to come up with a better algorithm. Focus on overall algorithms first, and optimize those.

If it's still too slow, then you can start optimizing the actual code, beginning with the slowest sections or the ones called most often.

The only effective way to do this is to measure first (and measure again after!).
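As a sketch of what "measure first and measure after" can look like in practice (Java again; `oldVersion` and `newVersion` are hypothetical stand-ins for the routine being optimized, not anything from the question):

```java
import java.util.function.DoubleSupplier;

public class MeasureBeforeAfter {
    // Time one call of a routine and report its result and duration.
    static double timed(String label, DoubleSupplier routine) {
        long start = System.nanoTime();
        double result = routine.getAsDouble();
        long elapsed = System.nanoTime() - start;
        System.out.printf("%s: result=%f, %.2f ms%n", label, result, elapsed / 1e6);
        return result;
    }

    // Hypothetical "before" implementation: a plain harmonic sum.
    static double oldVersion() {
        double sum = 0;
        for (int i = 1; i <= 5_000_000; i++) sum += 1.0 / i;
        return sum;
    }

    // Hypothetical "after" implementation of the same sum
    // (summation order differs, so the last digits may differ slightly).
    static double newVersion() {
        double sum = 0;
        for (int i = 5_000_000; i >= 1; i--) sum += 1.0 / i;
        return sum;
    }

    public static void main(String[] args) {
        timed("before", MeasureBeforeAfter::oldVersion);
        timed("after",  MeasureBeforeAfter::newVersion);
    }
}
```

Comparing the two numbers tells you whether the change was worth it; a profiler gives the same information with much finer detail about where the time actually goes.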


That entirely depends on the size of the factors. I can do single-digit multiplication (e.g. 7×9) in my head in a fraction of a second, whereas it would take me a few minutes to compute 365286×475201.


Modern Intel CPUs can do tens of billions of floating-point multiplies per second. At, say, 10 billion multiplies per second, 1000 multiplies cost on the order of 100 nanoseconds. I wouldn't worry about 1000 if I were you.

Intel doc showing FLOP performance of their CPUs


This depends on various things: the CPU you are using, the other processes currently running, what the JIT does, and so on.

The only reliable way to answer this question is to use a profiler and measure the effect of your optimization.
