Something happened today which shook the very foundations of what I’ve always believed about computers. See, maybe this was just a crazy notion, but I was always under the impression that if there was ONE thing computers did well, it was math. Simple math, algebra, geometry, calculus… it didn’t matter. Computers have always been equation solving machines. Or so I thought.

As it so happened, I was catching up on three months of procrastinated Quicken transactions and I had a slight discrepancy in my numbers. I hit Command-Space and typed “cal” to launch the built-in Apple calculator via LaunchBar in order to check my figures. Here is the equation I typed in:

`9533.24-215.10`

… and here is the garbage Apple babbled back at me: `9318.139999999999`

What? How is that possible? I’m subtracting two decimal numbers and the result is a repeating decimal? Thinking something was wrong, I began experimenting by simplifying the equation:

`9533.24-.1`

Result: `9533.139999999999`

Convinced I had the calculator in some whacked-out Reverse Polish mode or something, I began checking the menus. The only relevant menu item was a setting called “Precision” which ranged from 0 to 16 and defaulted to 12. How about Precision “Infinity”? I want my damned calculator to be precise enough to subtract simple decimals and apparently 12 isn’t enough to do this. As it turns out, “Precision” is a bit of a misnomer for this setting because it just represents how many digits you want to see before the number gets rounded. Anyway, that still doesn’t explain why an equation which needs no rounding to begin with is giving me a repeating decimal.
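That reading of “Precision” lines up with what standard number-formatting routines do: ask for 12 significant digits and the stray trailing 9s round away; ask for 16 and they come back. A sketch in Python (the `g`-style formatting here is my illustration, not Calculator’s actual code):

```python
result = 9533.24 - 215.10

# 12 significant digits (Calculator's default "Precision"): the binary
# error rounds away and trailing zeros are dropped.
print(f"{result:.12g}")   # 9318.14

# 16 significant digits: the representation error shows through.
print(f"{result:.16g}")
```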

Upon more experimentation, I discovered the following:

- The error doesn’t seem to occur on numbers less than 1000.
- The error only occurs on some numbers greater than 1000.
- The error doesn’t seem to occur on addition, but only subtraction.
- The principal software engineer at my company couldn’t tell me how this was even possible.
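The first two observations have a tidy explanation, as it turns out: a double carries roughly 15–16 significant decimal digits *total*, so the bigger the integer part, the less room is left for the fraction, and the gap between adjacent representable numbers (the “ulp”) grows with the magnitude of the number. Python’s `math.ulp` shows this directly (again, an illustration, not how Calculator works internally):

```python
import math

# The spacing between adjacent doubles grows with magnitude, so the
# same decimal fraction is stored less accurately above ~1000.
print(math.ulp(533.24))    # spacing near 533.24  (about 1.1e-13)
print(math.ulp(9533.24))   # spacing near 9533.24 (about 1.8e-12, 16x coarser)
```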

And so there you have it… what was once simple is now apparently difficult again, thanks to the otherwise brilliant piece of engineering that is OS X Panther. I’m sure the explanation has something to do with floating-point calculations, whatever the hell those are, but that doesn’t make this bug the least bit more acceptable. My worst nightmare is that the repeating decimal answer actually *is* the correct answer from a computing standpoint but most computers are smart enough to round it for us, knowing what we really want. That would really alter my perceptions of low-level computing quite a bit.
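And for the record, that nightmare scenario is exactly what’s going on: in binary floating point, the repeating-decimal answer *is* the honest one, and friendlier calculators just round it before showing it to you. Doing money math in actual base 10 sidesteps the whole mess; Python’s `decimal` module (my illustration of the general fix, not something Calculator offers) gets it exact:

```python
from decimal import Decimal

# Building the operands from strings keeps them in base 10, exactly
# as typed -- no binary conversion, no representation error.
balance = Decimal("9533.24") - Decimal("215.10")
print(balance)            # 9318.14

# The same subtraction in binary doubles picks up the tiny error.
print(9533.24 - 215.10)
```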

On the bright side, we finally found something PCs are better than Macs at.

Subtraction.