Is there a way to correctly multiply two 32 bit integers in Javascript?

When I try this from C using long long I get this:

printf("0x%llx * %d = %llx\n", 0x4d98ee96ULL, 1812433253,
      0x4d98ee96ULL * 1812433253);
==> 0x4d98ee96 * 1812433253 = 20becd7b431e672e

But from Javascript the result is different:

x = 0x4d98ee97 * 1812433253;
print("0x4d98ee97 * 1812433253 = " + x.toString(16));
==> 0x4d98ee97 * 1812433253 = 20becd7baf25f000

The trailing zeros lead me to suspect that Javascript has an oddly limited integer resolution somewhere between 32 and 64 bits.

Is there a way to get a correct answer? (I'm using Mozilla js-1.8.5 on x86_64 Fedora 15 in case that matters.)


This seems to do what I wanted without an external dependency:

function multiply_uint32(a, b) {
    // Split each operand into 16-bit halves so that every partial
    // product stays well below 2^53 and is exact in a double.
    var ah = (a >> 16) & 0xffff, al = a & 0xffff;
    var bh = (b >> 16) & 0xffff, bl = b & 0xffff;
    // Only the low 16 bits of the cross terms survive modulo 2^32.
    var high = ((ah * bl) + (al * bh)) & 0xffff;
    // The final >>> 0 reduces the sum modulo 2^32; without it the
    // result exceeds 32 bits whenever (high << 16) + al * bl overflows.
    return (((high << 16) >>> 0) + (al * bl)) >>> 0;
}

This performs a 32-bit multiply modulo 2^32, which is the correct bottom half of the computation. A similar function could be used to calculate a correct top half and store it in a separate integer (it would need ah * bh plus the carries out of the cross terms and of the low product), but I don't happen to need that.

Note the unsigned shifts (>>> 0). Without them the function generates negative values whenever the high bit is set, and the final one reduces the sum modulo 2^32 when (high << 16) + al * bl overflows 32 bits.
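
As a sanity check with the numbers from the question (the exact product is 0x20becd7baf25f093, so the low 32 bits should be 0xaf25f093):

x = multiply_uint32(0x4d98ee97, 1812433253);
print("low half = " + x.toString(16));
==> low half = af25f093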


From a forum post:

there's no need to make the numbers small; it only matters to keep the number of significant bits below 53

function mult32s(n, m) //signed version
{
    n |= 0;
    m |= 0;
    var nlo = n & 0xffff;
    var nhi = n - nlo;
    return ( (nhi * m | 0) + (nlo * m) ) | 0;
}

function mult32u(n, m) //unsigned version
{
    n >>>= 0;
    m >>>= 0;
    var nlo = n & 0xffff;
    var nhi = n - nlo;
    return ( (nhi * m >>> 0) + (nlo * m) ) >>> 0;
}

Both the | and >>> operators force their result to a 32-bit integer: | converts it to a signed integer, >>> to an unsigned one.

On the multiplication line, the first | / >>> makes the 64-bit intermediate result, which has at most a 48-bit significand (of the form 0xNNNNNNNNNNNN0000), drop its upper bits, so the intermediate value has the form 0xNNNN0000. The second | / >>> limits the result of the second multiplication and the addition to 32 bits.
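
For example, with the numbers from the question, the unsigned version recovers the exact low 32 bits that the naive multiplication rounded away:

print(mult32u(0x4d98ee97, 1812433253).toString(16));
==> af25f093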

In case one of the multiplicands is a constant you can simplify the multiplication further:

function mult32s_with_constant(m) //signed version
{
    m |= 0;
    //var n = 0x12345678;
    var nlo = 0x00005678;
    var nhi = 0x12340000;
    return ( (nhi * m | 0) + (nlo * m) ) | 0;
}

Or, if you know that the result is going to be less than 53 bits, you can simply do:

function mult32s(n, m) //signed version
{
    n |= 0;
    m |= 0;
    return ( n * m ) | 0;
}
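
Note that this shortcut silently fails on the numbers from the question, because their full product needs 62 significant bits, so the double is rounded before | 0 can truncate it:

print(mult32s(0x4d98ee97, 1812433253).toString(16));
==> -50da1000

The full version above returns -50da0f6d instead, i.e. 0xaf25f093 read as a signed 32-bit value, which is the correct low half.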


You'll likely need to make use of a third-party Javascript library to handle large-number precision.

For example, BigInt.js: http://www.leemon.com/crypto/BigInt.js
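
(As an aside: engines shipped long after this question also have a native BigInt type with arbitrary-precision integers, so in modern Javascript, though not in js-1.8.5, the computation from the question works directly:)

x = 0x4d98ee97n * 1812433253n;
print("0x4d98ee97 * 1812433253 = " + x.toString(16));
==> 0x4d98ee97 * 1812433253 = 20becd7baf25f093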


You are correct. Javascript has no integer type: all numbers are IEEE-754 double-precision floats, which can represent integers exactly only up to 53 bits.

In Javascript, it is the case that 10000000000000001 % 2 == 0.

A friend also mentions that 10000000000000001 == 10000000000000000, and that this is indeed required by the spec: engines may use ints internally as an optimization, but the spec still mandates float-like behavior.

Though once you're in this territory, you're already past the 53 bits of integer precision that a 64-bit float provides.
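
Both claims are easy to check in a shell; 2^53 is the first integer that collides with its neighbor:

print(Math.pow(2, 53) == Math.pow(2, 53) + 1);  // true: 2^53 + 1 rounds back to 2^53
print(10000000000000001 == 10000000000000000);  // true: double spacing here is 2
print(10000000000000001 % 2);                   // 0, for the same reason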


GWT has an emulation of Java's 64-bit signed long type. I made a JavaScript interface for it; here's a demo. Using the default numbers, you can see that the value is the same as the one you'd get in C.

The hex value in the "emulation" column should correspond to what you see in your debugger, but there may be problems with the hex representation, since I'm using native JavaScript to produce it. It could be done in GWT too, of course, which would probably make it more correct. As long as a JavaScript Number can represent all the string representations that GWT generates, the hex representation should be correct as well. Look in the source for usage. The reason the functions ({sub,mod,div,mul}ss) take strings is that I don't know how to construct a GWT Long object from JavaScript.
