really funky C# compiler behavior with nullable literals
The C# 4.0 compiler does not complain about this (not even a warning):
if(10.0 > null + 1)
{
}
if (myDoubleValue > null)
{
}
And the condition always seems to be false. What is going on here? Is null automatically converted to Nullable<double> or something?
If so why doesn't this work then:
double myDoubleValue = null + 1;
Also, why would I ever want such behavior? Why is it a good thing that it is possible to play around with null literals like this?
The reason the assignment doesn't work is that the result is of type int?, not double. The result will always be the null value of the int? (aka Nullable<int>) type, and there is no implicit conversion from int? to double.
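If you actually want a double out of such an expression, one option (a sketch, not part of the original answer; the variable names are made up) is to keep the result nullable, or supply a fallback with the null-coalescing operator:

double? maybeValue = null + 1;           // compiles: int? converts implicitly to double?
double myDoubleValue = (null + 1) ?? 0;  // ?? supplies 0 when the nullable result is null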
Oh, and both of the first two blocks should make the compiler complain, with warnings like this:
Test.cs(7,19): warning CS0458: The result of the expression is always 'null' of type 'int?'
Test.cs(12,13): warning CS0464: Comparing with null of type 'double?' always produces 'false'
Managing to compile without errors isn't the same as not complaining :)
if(10.0 > null + 1)
{
}
is actually equivalent to:
int? i = null + 1; // i.HasValue = false in this case
if(10.0 > i)
{
}
So you are actually comparing a non-nullable value with a nullable value that has no value, and the lifted > operator returns false whenever either operand is null.
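The lifted comparison semantics are easy to demonstrate (a small sketch; the variable name is made up):

int? none = null;
Console.WriteLine(10.0 > none);   // False
Console.WriteLine(10.0 <= none);  // also False: every ordering comparison with null yields false
Console.WriteLine(none == null);  // True: equality is the one operator that treats two nulls as equal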
double myDoubleValue = null + 1;
doesn't compile because the type of the right-hand side is int?, not double, and there is no implicit conversion from int? to double.
Also, is this question just out of curiosity, in an attempt to #%=^ the compiler, or are you actually writing something like this in a real project?