How do I stop losing pennies all the time?
I will be working for a company that develops software for financial institutions, so I will be dealing with money a lot. Before now it wasn't a major concern for me: I was doing small money-related things and Double was good enough. But now even one penny is darn important.
I think everyone knows the problem:
Dim mValue as Decimal = 100 'seems the best type for now
mValue = mValue / 3
Console.WriteLine(mValue)
Console.ReadLine()
mValue = mValue * 3
Console.WriteLine(mValue)
'Outputs 99.9999999999999999999999999... and never hits 100
So how can I get around this problem and get more precise results?
Thanks.
If you need to be able to perform arbitrary division, it sounds like you'll need a representation of a rational number (a fraction). I don't believe .NET provides one by default, and you'll have, um, "fun" implementing it in a way which performs as well as the built-in numeric types. That may get you going for addition, subtraction, multiplication and division - but it won't help if you need irrational numbers too (e.g. taking square roots).
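As a model of what such a rational type buys you (the question is about .NET, where you would have to implement this yourself; this sketch uses Python's built-in fractions.Fraction purely for illustration):

```python
from fractions import Fraction

value = Fraction(100)
value = value / 3        # stored exactly as the fraction 100/3
print(value)             # 100/3 - no digits are lost
print(value * 3)         # exactly 100 again
print(value * 3 == 100)  # True
```

Because the numerator and denominator are kept as exact integers, dividing and re-multiplying round-trips perfectly; the cost is that performance and memory use are worse than the built-in numeric types.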
Are you sure you're going to be dividing by 3 though? Are you sure you can't round to (say) the nearest 1000th of a penny? You should find out what the exact requirements are around your calculations - I doubt that you really need to be infinitely precise.
When dealing with money you should never divide like that. You should allocate an amount as evenly as possible, getting a list of allocations as a result, e.g.:
100 allocated by 3 = [ 33, 33, 34 ]
Someone must get that extra penny.
Also, it's simpler to use an integer that represents the number of pennies than to use a decimal number.
And, encapsulate money operations in a dedicated class. Take a look at the Money pattern.
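A minimal sketch of that idea (Python, with names of my own choosing; treat it as an illustration of the pattern, not a production class): store an integer number of pennies, and make allocation hand out the remainder pennies instead of dividing.

```python
class Money:
    """Sketch of the Money pattern: an exact integer count of pennies."""

    def __init__(self, cents):
        self.cents = int(cents)

    def __add__(self, other):
        return Money(self.cents + other.cents)

    def __eq__(self, other):
        return self.cents == other.cents

    def allocate(self, n):
        # Split as evenly as possible; the last `remainder` shares
        # each get one extra penny, so no penny is lost.
        base, remainder = divmod(self.cents, n)
        return [Money(base + (1 if i >= n - remainder else 0))
                for i in range(n)]

    def __repr__(self):
        return f"${self.cents // 100}.{self.cents % 100:02d}"
```

With this, 100 pennies allocated by 3 gives shares of 33, 33 and 34 pennies, and the shares always sum back to the original amount.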
Can you offer me a real life use case example for this?
mValue = mValue / 3
Divide $100 between three bank robbers?
Well, you cannot give $33.(3) to each; $0.01 will still have to go "nowhere" (if you don't want them to kill each other).
I'm only saying this because money ain't math, and you need to be sure about the required precision.
Not more, and not less.
I'm not an expert in this field so someone with more experience might answer this better.
You can pretty safely carry 10,000 digits or so through the calculation, and then round to the nearest penny when you want to use the number. This works for the example you posted:
100 / 3 = 33.333333333333333, and 33.333333333333333 * 3 = 99.999999999999999, which then rounds to 100.
I know that many systems use what's called banker's rounding. This is to prevent systematically losing money when a value ends with exactly 5: it rounds up if the digit before the 5 is odd, and rounds down if that digit is even. Something to consider.
http://en.wikipedia.org/wiki/Rounding
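To illustrate banker's rounding (round-half-to-even), here is a small sketch using Python's decimal module; .NET's Math.Round behaves the same way by default:

```python
from decimal import Decimal, ROUND_HALF_EVEN

def round_half_even(value):
    # Round to the nearest penny, with ties going to the even digit.
    return value.quantize(Decimal("0.01"), rounding=ROUND_HALF_EVEN)

print(round_half_even(Decimal("2.125")))  # 2.12 (2 is even, tie rounds down)
print(round_half_even(Decimal("2.135")))  # 2.14 (3 is odd, tie rounds up)
```

Over many ties this rounds up and down equally often, so the rounding error doesn't accumulate in one direction the way always-round-half-up does.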
What is important is to use precisely-defined rounding rules for a particular application. If a billing application specifies that all line item amounts will be rounded to the nearest penny, then an invoice total should reflect that. If it specifies that all calculations will be performed to a hundredth of a penny, then the invoice should probably reflect that (perhaps formatting the fractional pennies as little superscripts, the way gas prices do). In some cases, one might allocate fractional pennies among invoice items on an ongoing basis, e.g.
old_total = total
old_rounded_total = rounded_total
total = total + line_item
rounded_total = round_to_penny(total)
displayed_line_item_cost = rounded_total - old_rounded_total
In other cases, one might round all invoice lines to the nearest penny, subtract from that total the sum of all non-rounded invoice lines, and adjust up or down those lines which were closest to the "penny" boundary.
If the rounding semantics are well-defined, and the application follows them, the results will precisely match the specification. If the semantics are not well-defined, the results won't be either.
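The ongoing-allocation idea above can be sketched as follows (Python for illustration; round_to_penny and the half-up tie rule are my own choices, since the point is that the rule must simply be defined and followed):

```python
from decimal import Decimal, ROUND_HALF_UP

def round_to_penny(value):
    return value.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

def displayed_line_items(line_items):
    """Display each line as the change in the rounded running total,
    so the displayed amounts always sum to the rounded grand total."""
    total = Decimal("0")
    rounded_total = Decimal("0.00")
    displayed = []
    for line_item in line_items:
        total = total + line_item
        new_rounded_total = round_to_penny(total)
        displayed.append(new_rounded_total - rounded_total)
        rounded_total = new_rounded_total
    return displayed
```

For example, three line items of 0.333 display as 0.33, 0.34 and 0.33: the fractional pennies are absorbed line by line, and the displayed lines add up to exactly the rounded total of 1.00.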