Why is the resulting type of a division of short integers in Java not a short integer?

Consider this code:

public class ShortDivision {
    public static void main(String[] args) {
        short i = 2;
        short j = 1;
        short k = i/j;
    }
}

Compiling this produces the error

ShortDivision.java:5: possible loss of precision
found   : int
required: short
        short k = i/j;

because the type of the expression i/j is apparently int, and hence must be cast to short.
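For reference, the code compiles once the int result is explicitly narrowed back to short, which makes the potential truncation visible at the assignment:

public class ShortDivision {
    public static void main(String[] args) {
        short i = 2;
        short j = 1;
        short k = (short) (i / j); // cast the int result back to short
        System.out.println(k);     // prints 2
    }
}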

Why is the type of i/j not short?


From the Java spec:

5.6.2 Binary Numeric Promotion

When an operator applies binary numeric promotion to a pair of operands, each of which must denote a value of a numeric type, the following rules apply, in order, using widening conversion (§5.1.2) to convert operands as necessary:

If either operand is of type double, the other is converted to double.

Otherwise, if either operand is of type float, the other is converted to float.

Otherwise, if either operand is of type long, the other is converted to long.

Otherwise, both operands are converted to type int.

In other words: for binary operations, the smaller integer types (byte, short, char) are promoted to int, and the result of the operation is an int.
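You can observe the promoted type directly by letting autoboxing pick the wrapper class for the expression's result (a small sketch; the class name PromotionDemo is just for illustration):

public class PromotionDemo {
    public static void main(String[] args) {
        short a = 1;
        short b = 2;
        byte c = 3;
        long d = 4L;
        // Boxing the expression result reveals its static type.
        System.out.println(((Object) (a + b)).getClass().getSimpleName()); // Integer
        System.out.println(((Object) (c * c)).getClass().getSimpleName()); // Integer
        System.out.println(((Object) (a + d)).getClass().getSimpleName()); // Long
    }
}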


EDIT: Why is it like that? The short answer is that Java copied this behavior from C. A longer answer might have to do with the fact that all modern machines do at least 32-bit native computations, and it might actually be harder for some machines to do 8-bit and 16-bit operations.

See also: OR-ing bytes in C# gives int


Regarding the motivation: let's imagine the alternatives to this behaviour and see why they don't work:

Alternative 1: the result should always be the same type as the inputs.

What should the result be for adding an int and a short?

What should the result be for multiplying two shorts? The product in general only fits in an int, but if the result were truncated to short, most multiplications would silently lose their high bits, and casting the truncated result back to int afterwards cannot recover them.
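A small sketch of the silent truncation this alternative would impose (the class name ShortMultiply is just for illustration):

public class ShortMultiply {
    public static void main(String[] args) {
        short x = 1000;
        short y = 1000;
        int full = x * y;                  // promoted to int: 1000000
        short truncated = (short) (x * y); // keeps only the low 16 bits
        System.out.println(full);          // 1000000
        System.out.println(truncated);     // 16960: the overflow is silent
    }
}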

Alternative 2: the result should always be the smallest type that can represent all possible outputs.

If the return type were a short, the answer would not always be representable as a short.

A short can hold values from -32,768 to 32,767, so this result overflows:

short result = -32768 / -1; // 32768: not a short

So your question becomes: why does adding two ints not return a long? What should multiplying two ints return? A long? A BigInteger, to cover the case of squaring Integer.MIN_VALUE?
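Java's answer is "no" across the board: int arithmetic stays int and wraps on overflow, and widening is opt-in. A small sketch:

public class IntOverflow {
    public static void main(String[] args) {
        int max = Integer.MAX_VALUE;          // 2147483647
        System.out.println(max + 1);          // -2147483648: wraps, stays int
        System.out.println(max * max);        // 1: low 32 bits of the square
        System.out.println((long) max * max); // 4611686014132420609: widen first
    }
}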

Alternative 3: Choose the thing most people probably want most of the time

So the result should be:

  • int for multiplying two shorts, or any int operations.
  • short if adding or subtracting shorts, dividing a short by any integer type, multiplying two bytes, ...
  • byte if bitshifting a byte to the right, int if bitshifting to the left.
  • etc...

Remembering all these special cases would be difficult, because there is no fundamental logic to them. It's simpler to just say: the result of integer operations is always an int.
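A related wrinkle worth knowing: the compound assignment operators include an implicit narrowing cast (JLS §15.26.2), so they compile where the expanded form does not:

public class CompoundAssign {
    public static void main(String[] args) {
        short s = 1;
        // s = s + 1;   // does not compile: s + 1 is an int
        s += 1;         // compiles: equivalent to s = (short) (s + 1)
        s++;            // likewise fine
        System.out.println(s); // prints 3
    }
}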


It's just a design choice to be consistent with C and C++, which were the dominant languages when Java was designed.

For example, i * j could have been defined to promote the result type (byte => short, short => int, int => long), which would avoid overflows, but Java doesn't do this (some languages do). Casting could then be used wherever the current behaviour was wanted, and the loss of bits would be explicit.

Similarly, i / j could have been promoted from byte/short => float, or from int/long => double.
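Note that Java already lets you opt into the wider result with an explicit cast on one operand, which arguably states the intent more clearly than an automatic promotion would (a sketch; the class name ManualWidening is just for illustration):

public class ManualWidening {
    public static void main(String[] args) {
        int i = 1_000_000;
        // Widening one operand first makes the intent explicit.
        long product = (long) i * i;   // 1000000000000: no overflow
        double ratio = (double) 1 / 3; // 0.3333...: floating-point division
        System.out.println(product);
        System.out.println(ratio);
    }
}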
