
Why does implicit conversion to int convert and truncate decimal?

Code:

void Main()
{
    C.F();
}
public class C
{
    public static void F()
    {
        var a = new A { i = 1, d = 2.5m };
        var b = new B(a);
        I(b);  // requires a conversion from B to int
        D(b);  // requires a conversion from B to decimal
    }
    static void I(int i) { Console.WriteLine("int is: " + i); }  
    static void D(decimal d) { Console.WriteLine("decimal is: " + d); }
}
public class A
{
    public int i;
    public decimal d;
}
public class B
{
    A _a;
    public B(A a) { _a = a; }
    public static implicit operator int(B b) { return b._a.i; }
    public static implicit operator decimal(B b) { return b._a.d; }
}

OUTPUT:

int is: 1
decimal is: 2.5

Now comment out the decimal operator:

//public static implicit operator decimal(B b) { return b._a.d; }

OUTPUT:

int is: 1
decimal is: 1

What is going on when the second version runs and outputs 1 for both cases?


My guess is that the compiler sees that there is an implicit conversion from B to int, and a built-in implicit conversion from int to decimal, and uses both in sequence. In other words, the call becomes D((decimal)(int)b). C# permits this: a user-defined implicit conversion may be followed by a standard implicit conversion, such as the int-to-decimal promotion.

Note that nothing is being truncated; rather, an int is being promoted to a decimal. If instead you comment out the int conversion, I expect that I(b) will fail to compile: even though there is an implicit conversion from B to decimal, decimal to int is only an explicit conversion, so it cannot complete the implicit conversion sequence.
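
A minimal sketch of the first case (hypothetical names modeled on the question's code, with only the int operator kept, as in the second run) that spells out the chain the compiler builds:

using System;

public class B
{
    // Only the int conversion is defined, as in the second run.
    public static implicit operator int(B b) { return 1; }
}

public class Program
{
    static void D(decimal d) { Console.WriteLine("decimal is: " + d); }

    public static void Main()
    {
        var b = new B();
        D(b);                 // the compiler emits the chain below
        D((decimal)(int)b);   // B -> int (user-defined), then int -> decimal (built-in)
        // Both lines print "decimal is: 1".
    }
}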


When you comment that line out, the compiler picks the int operator, because there is a built-in implicit conversion from int to decimal that completes the chain.
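
To see the asymmetry both answers rely on, here is a complementary sketch (hypothetical, with only the decimal operator defined): the reverse composition does not work implicitly, because decimal to int is an explicit conversion.

using System;

public class B
{
    // Only the decimal conversion is defined this time.
    public static implicit operator decimal(B b) { return 2.5m; }
}

public class Program
{
    static void I(int i) { Console.WriteLine("int is: " + i); }

    public static void Main()
    {
        var b = new B();
        // I(b);            // compile error ("cannot convert from 'B' to 'int'"):
                            // decimal -> int is explicit, so it cannot follow the
                            // user-defined B -> decimal conversion implicitly.
        I((int)(decimal)b); // compiles once both steps are written out explicitly;
                            // prints "int is: 2" - the truncation is opt-in, never silent.
    }
}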
