
C# unit testing number precision questions

I am testing basic math functions that return the mean/variance/standard deviation. The problem I am facing is that I cannot get the precision of the "expected value" to match what is returned by the function. For example, if the variance function returns 50.5555555555566, even if I set the expected value explicitly to 50.5555555555566, it says they are two different doubles and the unit test fails.

Below is the actual output from the unit test:

Assert.AreEqual failed. Expected:<50.5555555555556>. Actual:<50.5555555555566>.

Can anyone advise on a way around this? I am using the built-in Visual Studio unit testing suite. Thanks.


Floating-point (Single/Double) numbers need to be tested with a tolerance value: if the two numbers are within 0.0001 (the tolerance) of each other, consider them equal.

In NUnit, you have comparison asserts, e.g. the following overload of AreEqual; find the equivalent one for MSTest:

Assert.AreEqual( double expected, double actual, double tolerance,
                 string message );
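
For example, a minimal sketch of how that overload is used (the variance value here is just a stand-in for whatever your function returns):

double variance = 50.5555555555566;   // stand-in for the value your function returns

// Passes: the difference (about 1e-12) is within the 1e-9 tolerance.
Assert.AreEqual( 50.5555555555556, variance, 1e-9,
                 "variance outside tolerance" );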

Update: MSTest has an equivalent AreEqual overload that takes a delta parameter; that should be the method you need. Try it and see if it resolves your issue.
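
A minimal MSTest sketch, assuming a stand-in value in place of a call to your own variance function:

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class StatisticsTests
{
    [TestMethod]
    public void Variance_IsWithinTolerance()
    {
        // Stand-in for the value your variance function actually returns.
        double actual = 50.5555555555566;

        // MSTest overload: Assert.AreEqual(double expected, double actual, double delta).
        // The assert passes when |expected - actual| <= delta.
        Assert.AreEqual(50.5555555555556, actual, 1e-9);
    }
}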

