Yes, children do learn that. But on a calculator, they're inputting numbers in decimal, and the arithmetic is decimal internally. In programming, we input numbers in decimal, and even write them that way in source code, but the actual math is done in binary; hence the disconnect between the common-sense expectation of what (0.1 + 0.2) ought to do and what it actually does. Someone coming from a calculator would not expect that sum to be unequal to 0.3, unlike the situation with square roots.
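A minimal Python sketch of the surprise (Python's floats are standard IEEE 754 doubles, so the same thing happens in most languages):

```python
import math

# 0.1 and 0.2 have no exact binary representation, so the
# stored values are tiny approximations and the sum picks
# up rounding error.
total = 0.1 + 0.2
print(total)          # 0.30000000000000004
print(total == 0.3)   # False

# The usual workaround: compare with a tolerance instead
# of exact equality.
print(math.isclose(total, 0.3))  # True
```

A calculator hides this by working in decimal (or by rounding for display), which is exactly why the binary result feels wrong.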