
Data conversion in Java

char c='c';
int i=10;
double d =50;
long l=30;
String s="Goodbye";

Are these statements valid?

s+=i;
i+=s;
c+=s;
c=c+i;

Can someone explain the logic of converting data types?


Why don't you give it a try:

bash-3.2$ cat ConversionTest.java 
    public class ConversionTest {
        public static void main( String [] args ) {
            char c='c';
            int i=10;
            double d =50;
            long l=30;
            String s="Goodbye";

            //Are these statements valid?

            s+=i;
            i+=s;
            c+=s;
            c=c+i;
        }
    }
bash-3.2$ javac ConversionTest.java 
ConversionTest.java:12: incompatible types
found   : int
required: java.lang.String
            i+=s;
            ^
ConversionTest.java:13: incompatible types
found   : char
required: java.lang.String
            c+=s;
            ^
ConversionTest.java:14: possible loss of precision
found   : int
required: char
            c=c+i;
               ^
3 errors

EDIT

Long story.

Basically, all the types in Java have a "shape", if you want to call it that (well, I'm going to call it that for this answer).

For the primitive data types (boolean, byte, short, char, int, float, long, double) the "shape" is the size in bits each one uses (1 byte = 8 bits):

boolean = true or false  
byte = 8 bits
short = 16 bits
char = 16 bits
int = 32 bits
float = 32 bits
long = 64 bits
double = 64 bits

The "shape" of objects varies according to it class.

So, basically, you can assign anything to anything as long as it fits in the "shape".

So you can assign an int to a long (you can think of it as 32 bits fitting into 64 bits), a short (16) into an int (32), and so on.
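As a minimal sketch of that point (the class and variable names here are just for illustration), all of these assignments compile without any cast, because each value is poured into a wider "shape":

    public class WideningDemo {
        public static void main(String[] args) {
            short sh = 100;     // 16 bits
            int   in = sh;      // 16 -> 32 bits: widens, no cast needed
            long  lo = in;      // 32 -> 64 bits: widens, no cast needed
            char  ch = 'c';
            int   code = ch;    // char (16) -> int (32) also widens
            System.out.println(in + " " + lo + " " + code);
        }
    }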

What you can't do is to assign something that doesn't fit in the shape.

So

ConversionTest.java:12: incompatible types
found   : int
required: java.lang.String
            i+=s;
            ^

You can't assign a String to an int. How would you? Where would the contents go? They are not of the same "shape", nor even a compatible one.

The same goes for String to char:

ConversionTest.java:13: incompatible types
found   : char
required: java.lang.String
            c+=s;
            ^

Now, you might want to assign an int (32 bits) to a char (16 bits) or to a short (16 bits). The problem is that if the value needs more than 16 bits (131 071, for instance), you would lose the bits that do not fit into 16 bits. That's why you get this error:

ConversionTest.java:14: possible loss of precision
found   : int
required: char
            c=c+i;

However, if you are sure that it fits (for instance int i = 65;, which certainly fits into 16 bits), you can cast it, like this:

 int i = 65;
 char c = ( char ) i;

Casting is the way you tell the compiler:

Hey I'm the programmer here, I know what I'm doing.
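If you want to see that loss of precision with your own eyes, here is a minimal sketch (the class name and values are only illustrative) that casts one int that fits into 16 bits and one that does not:

    public class NarrowingDemo {
        public static void main(String[] args) {
            int small = 65;
            char ok = (char) small;        // fits in 16 bits: this is 'A'

            int big = 131071;              // 0x1FFFF, needs 17 bits
            char chopped = (char) big;     // the high bits are dropped, leaving 0xFFFF

            System.out.println((int) ok);        // 65
            System.out.println((int) chopped);   // 65535
        }
    }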


Yes, no, no, no (unless you explicitly perform a typecast). If you were to write up a simple main method and compile it, you would have seen this yourself; these problems are identified by the compiler.

This page on Java primitive data types explains it pretty well.


char c='c';
int i=10;
double d =50;
long l=30;
String s="Goodbye";

s+=i; // legal :)
i+=s; // not legal :( The operator += is undefined for the argument types int, String
c+=s; // not legal :( The operator += is undefined for the argument types char, String
c=c+i; // not legal :( Type Mismatch: cannot convert from int to char

The complete explanation of Java data type conversions is long and detailed.


There are two types of conversions: widening conversions and narrowing conversions. Widening conversions are allowed and Java will handle them for you, but narrowing conversions are not done implicitly. A widening conversion means that you are converting a "smaller" value such as int (32 bits) to a "larger" value such as long (64 bits). Narrowing conversions, which go the other way, have to be done explicitly.

 s+=i;

will require an int to be converted to a String which is allowed.

i+=s;

will require a String to be converted to an int, which is not allowed. The += operator will translate to

i = i + s;

and i + s will return a String which cannot be assigned to an int.
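More precisely, a compound assignment also inserts an implicit cast back to the left-hand type, which is why s += i gets through while i += s does not. A rough sketch of that expansion (my own illustration, not compiler output):

    public class CompoundAssignDemo {
        public static void main(String[] args) {
            int i = 10;
            String s = "Goodbye";

            // s += i;  behaves like  s = (String) (s + i);  and s + i is a String, so it compiles
            s = (String) (s + i);

            // i += s;  would behave like  i = (int) (i + s); but i + s is a String,
            // and a String can never be cast to int, hence the "incompatible types" error
            // i = (int) (i + s);   // uncommenting this line breaks the compile

            System.out.println(s + " " + i);
        }
    }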

c+=s;

This is not allowed for a similar reason: c + s returns a String, which you are then trying to assign to a char.

c=c+i;

will also give an error because c + i will result in an int (32 bits) and assigning it to a char (16 bits) may cause loss of precision.

Each of the operations you tried is actually possible, but you have to explicitly tell Java that you want to do them and will accept the consequences. Having said that, mixed-type operations are frowned upon in the purist, hard-nosed programming arena, since there are edge cases that can cause problems.
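As a hedged sketch of that last point, here is one way to spell out each intention explicitly; note that a String can only become an int by parsing a numeric string (something like "10", not "Goodbye"), so the helper calls below are illustrative choices rather than the only option:

    public class ExplicitConversions {
        public static void main(String[] args) {
            char c = 'c';
            int i = 10;
            String s = "Goodbye";

            s += i;                               // already legal: string concatenation
            int parsed = Integer.parseInt("10");  // String -> int only via explicit parsing of a numeric String
            char first = s.charAt(0);             // taking a char out of a String has to be asked for explicitly
            c = (char) (c + i);                   // narrowing the int result back to char needs a cast

            System.out.println(s + " " + parsed + " " + first + " " + c);
        }
    }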


s += i will concatenate s and the string "10"; it is roughly equivalent to s += Integer.toString(i);

i += s: I don't think this will work; the types are incompatible.

c += s also shouldn't compile; same thing, incompatible types.

c = c + i should add 10 to the ASCII value of c, so c will become the 10th letter after 'c' => 'm', I guess.

EDIT: So in the last case you have to cast the result of c + i to char to make it compile.
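A quick sketch to check that last claim; note that the cast has to wrap the whole sum, because char + int is promoted to int:

    public class CharArithmeticDemo {
        public static void main(String[] args) {
            char c = 'c';           // code point 99
            int i = 10;
            c = (char) (c + i);     // 99 + 10 = 109, which is 'm'
            System.out.println(c);  // prints m
        }
    }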
