LINQ Aggregate behavior of nullable types
Can someone explain what is going on here? How come both of these things are true?
[TestMethod]
public void WhatIsGoingOnHere()
{
    List<int?> list = new List<int?> { 1, 2, 3, null, 5, 6 };
    Assert.AreEqual(17, list.Sum());

    int? singleSum = 1 + 2 + 3 + null + 5 + 6;
    Assert.IsNull(singleSum);
}
Specifically, why does the Sum() method not return null? And why does singleSum not equal 17?
The .Sum() method over nullable types ignores all null values:

MSDN: Enumerable.Sum Method (IEnumerable<Nullable<Int32>>)

Remarks: The result does not include values that are null.
Whereas if you add any number to null via the + operator, the result is null.

As Andrew Hare noted, it doesn't make sense to add a number to null: null is not 0; it is just not a number.
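To make the lifted-operator behavior concrete, here is a minimal sketch (the variable names are mine, just for illustration):

using System;

int? maybe = null;
int? lifted = 5 + maybe;              // lifted +: if either operand is null, the result is null
Console.WriteLine(lifted.HasValue);   // False

// To make manual addition treat null as zero, you have to say so explicitly,
// for example with the null-coalescing operator:
int total = 1 + 2 + 3 + (maybe ?? 0) + 5 + 6;
Console.WriteLine(total);             // 17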
What you are seeing is the difference between using Enumerable.Sum and actually adding the values yourself.

The important thing here is that null is not zero. At first glance you would think that singleSum should equal 17, but that would mean we would have to assign different semantics to null based on the data type of the reference. The fact that this is an int? makes no difference: null is null and should never be semantically equal to the numeric constant 0.
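You can see that distinction directly (a small illustrative snippet; the variable name is mine):

using System;

int? maybe = null;
Console.WriteLine(maybe == 0);                 // False: a lifted comparison with null is never equal
Console.WriteLine(maybe.GetValueOrDefault());  // 0, but only because we explicitly asked for a default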
The implementation of Enumerable.Sum is designed to skip over any value that is null in the sequence, which is why you are seeing the different behavior between the two tests. The second test, however, rightly returns null, as the compiler is smart enough to know that adding anything to null yields null.
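Since the question title mentions Aggregate, note that you can reproduce both behaviors with Enumerable.Aggregate, which makes the difference explicit. A minimal sketch (the variable names are mine):

using System;
using System.Collections.Generic;
using System.Linq;

var list = new List<int?> { 1, 2, 3, null, 5, 6 };

// Sum-like semantics: coalesce each null to 0, so the fold yields 17
int? skipping = list.Aggregate((int?)0, (acc, x) => acc + (x ?? 0));

// Operator-like semantics: let the lifted + propagate, so one null poisons the fold
int? propagating = list.Aggregate((int?)0, (acc, x) => acc + x);

Console.WriteLine(skipping);            // 17
Console.WriteLine(propagating == null); // True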
Here is the implementation of the Enumerable.Sum overload that accepts an IEnumerable<int?>:
public static int? Sum(this IEnumerable<int?> source)
{
    if (source == null)
    {
        throw Error.ArgumentNull("source");
    }
    int num = 0;
    foreach (int? nullable in source)
    {
        // As you can see here, it is explicitly designed to
        // skip over any null values
        if (nullable.HasValue)
        {
            num += nullable.GetValueOrDefault();
        }
    }
    return new int?(num);
}
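If you wanted the opposite behavior, a sum that becomes null as soon as any element is null, you could write your own extension modeled on the code above. A hypothetical sketch (SumOrNull is my name, not a BCL method):

using System;
using System.Collections.Generic;

public static class NullableSumExtensions
{
    // Hypothetical variant of Sum that propagates null instead of skipping it,
    // matching the semantics of the lifted + operator.
    public static int? SumOrNull(this IEnumerable<int?> source)
    {
        if (source == null)
        {
            throw new ArgumentNullException(nameof(source));
        }
        int num = 0;
        foreach (int? nullable in source)
        {
            if (!nullable.HasValue)
            {
                return null; // a single null makes the whole sum null
            }
            num += nullable.GetValueOrDefault();
        }
        return num;
    }
}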