
Even with a decimal representation, there are situations where you have to remember to round to a cent.
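For instance, even with Python's exact `decimal` module, a simple division produces a fraction of a cent, and rounding remains an explicit, separate step:

```python
from decimal import Decimal, ROUND_HALF_UP

# Splitting $10.00 three ways yields a repeating fraction,
# even though every input is an exact decimal.
share = Decimal("10.00") / 3
print(share)  # 3.333333333333333333333333333

# Rounding to a cent does not happen automatically;
# you have to remember to quantize.
rounded = share.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
print(rounded)  # 3.33
```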

Decimals are software libs; what could go wrong?

More people use the floating-point instructions of a popular CPU than any given decimal library.

If you're starting from scratch, it's probably a lot less work to write, test, and debug a Money class based on floating-point, whose overloaded math operations do the required rounding (so that code using the class cannot forget), than to build a Money class based on decimal arithmetic.
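A minimal sketch of that idea (names and details are illustrative, not from the comment): every overloaded operation rounds its result to a cent in the constructor, so client code gets correct-to-the-cent results without remembering to round.

```python
class Money:
    """Float-backed money that rounds to a cent after every operation.
    A hypothetical sketch, not a production implementation."""

    def __init__(self, amount: float):
        # Every Money value passes through here, so rounding
        # to a cent cannot be forgotten by calling code.
        self.amount = round(amount, 2)

    def __add__(self, other: "Money") -> "Money":
        return Money(self.amount + other.amount)

    def __sub__(self, other: "Money") -> "Money":
        return Money(self.amount - other.amount)

    def __mul__(self, factor: float) -> "Money":
        # e.g. applying a tax rate; the result is rounded in __init__
        return Money(self.amount * factor)

    def __eq__(self, other: object) -> bool:
        return isinstance(other, Money) and self.amount == other.amount

    def __repr__(self) -> str:
        return f"Money({self.amount:.2f})"

# The classic 0.1 + 0.2 case comes out as an exact cent amount.
total = Money(0.10) + Money(0.20)
print(total)  # Money(0.30)
```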

(The last time I wrote an accounting system, I made a Money class based on integers. It could be retargeted to other representations easily: make the change, then compare the ledger totals and other reports to see if there is a difference.)
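A sketch of what the integer-backed version might look like (all details assumed): amounts are stored as whole cents, so addition and subtraction are exact, and rounding only happens where fractions can actually appear, such as multiplying by a rate.

```python
class IntMoney:
    """Money stored as integer cents; a hypothetical sketch of the
    integer representation described above (non-negative amounts only)."""

    def __init__(self, cents: int):
        self.cents = cents

    @classmethod
    def from_dollars(cls, dollars: str) -> "IntMoney":
        # Parse a string like "12.34" to avoid float parsing entirely.
        whole, _, frac = dollars.partition(".")
        return cls(int(whole) * 100 + int(frac.ljust(2, "0")[:2]))

    def __add__(self, other: "IntMoney") -> "IntMoney":
        return IntMoney(self.cents + other.cents)

    def __sub__(self, other: "IntMoney") -> "IntMoney":
        return IntMoney(self.cents - other.cents)

    def __mul__(self, rate: float) -> "IntMoney":
        # Rounding happens only here, where fractions can appear.
        return IntMoney(round(self.cents * rate))

    def __repr__(self) -> str:
        return f"${self.cents // 100}.{self.cents % 100:02d}"

# Integer addition is exact, so ledger totals cannot drift.
print(IntMoney.from_dollars("0.10") + IntMoney.from_dollars("0.20"))  # $0.30
```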



> Decimals are software libs; what could go wrong?

Bugs in libraries do exist, but it's much easier to fix a bug in one place than to track down every single line where floating-point operations could misbehave.


> More people use the floating-point instructions of a popular CPU than any given decimal library.

Perhaps, but how many of the former are in a position to notice accuracy issues? I have more faith in a reputable decimal library than your average FPU, frankly.



