Can you explain this Math.Log10 vs. BigInteger.Log10 behavior?
Can somebody explain the following System.Numerics.BigInteger behavior?
Console.WriteLine(Math.Log10(100)); // prints 2
Console.WriteLine(Math.Log10(1000)); // prints 3 (as expected)
Console.WriteLine((int)Math.Log10(100)); // prints 2
Console.WriteLine((int)Math.Log10(1000)); // prints 3 (as expected)
var bi100 = new BigInteger(100);
var bi1000 = new BigInteger(1000);
Console.WriteLine(BigInteger.Log10(bi100)); // prints 2
Console.WriteLine(BigInteger.Log10(bi1000)); // prints 3 (as expected)
Console.WriteLine((int)BigInteger.Log10(bi100)); // prints 2
Console.WriteLine((int)BigInteger.Log10(bi1000)); // prints 2 ???????
Console.WriteLine(Math.Floor(BigInteger.Log10(bi100))); // prints 2
Console.WriteLine(Math.Floor(BigInteger.Log10(bi1000))); // prints 2 ???????
Console.WriteLine(Math.Round(BigInteger.Log10(bi100))); // prints 2
Console.WriteLine(Math.Round(BigInteger.Log10(bi1000))); // prints 3 (as expected)
EDIT: Please note that I know that it's a rounding problem. I want to know why the behavior of Math.Log10 and BigInteger.Log10 differs.
It is due to precision and rounding.
This line:
Console.WriteLine((int)BigInteger.Log10(bi1000));
truncates the value 2.9999999999999996 down to 2 when it is cast to int, whereas Console.WriteLine formats the same double as 3.
You can verify this by storing the result in an intermediate double variable and inspecting its value:
double x = BigInteger.Log10(bi1000);
Console.WriteLine((int)x); // prints 2
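To see exactly what x holds, print it with the round-trip format specifier (a minimal sketch; the precise digits assume the Math.Log-based implementation described below):
double x = BigInteger.Log10(bi1000);
Console.WriteLine(x.ToString("R")); // prints 2.9999999999999996, the double just below 3
Console.WriteLine(x);               // prints 3, because the default double formatting rounds before displaying
                                    // (newer .NET runtimes print the full 2.9999999999999996 instead)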
The big difference is that BigInteger.Log10(x) is implemented as Math.Log(x)/Math.Log(10), whereas Math.Log10(x) is implemented differently (it's an extern, so it's not easy to see exactly how). Regardless, they clearly use slightly different algorithms for computing a base-10 logarithm, which causes a slight difference in output.
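You can reproduce the same discrepancy without BigInteger at all by comparing the two ways of computing a base-10 logarithm (a sketch assuming the Math.Log(x)/Math.Log(10) formulation described above):
Console.WriteLine(Math.Log10(1000));                     // 3
Console.WriteLine(Math.Log(1000) / Math.Log(10));        // 2.9999999999999996 (older runtimes display this as 3)
Console.WriteLine((int)Math.Log10(1000));                // 3
Console.WriteLine((int)(Math.Log(1000) / Math.Log(10))); // 2, the same symptom as (int)BigInteger.Log10(bi1000)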
The behaviour differs because they are different types with different representations and different implementations.