And this is what the Devil does - JavaScript
JavaScript, our Lord and master, has math that is broken as fuck because of the decision to treat all numbers as floating point.
I know exactly why Brendan Eich did this. It's easier to build a single number type at the instruction level (I learned this in a compiler/architecture class): you build the harder floating-point pipeline once and it covers both floats and integers. This is the same decision I made when building my own instruction set (I digress), and it allowed me to make floating-point operations faster than my classmates' instruction sets, since implementing separate pipelines requires building out more of the instruction set, probably caches, and so on, with only a small penalty on integer operations that you wouldn't notice at first glance.
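You can see that single-number-type decision straight from the console. A quick sketch in plain, standard JavaScript (the values are just illustrative):

typeof 42;                // "number"
typeof 4.2;               // "number" -- same type, there is no int/float split
42 === 42.0;              // true -- the integer 42 and the float 42.0 are the same value
Number.isInteger(42);     // true, but under the hood it is still a 64-bit double
Number.MAX_SAFE_INTEGER;  // 9007199254740991, i.e. 2^53 - 1, where the 53-bit
                          // mantissa stops being able to represent every integer exactly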
Example:
parseFloat('1.42') + parseFloat('2.10'); // => 3.52
👍Good
parseFloat('1.41') + parseFloat('2.1'); // => 3.51
👏 Things are still working
parseFloat('1.41') + parseFloat('2.11'); // => 3.5199999999999996
🤨🤬️ Jesus Christ.
What is this garbage?
What happened is that neither 1.41 nor 2.11 has an exact binary representation, so each one is stored as the nearest 64-bit double, carrying a tiny hidden error in the trailing bits of its mantissa. Add them and those errors combine; the sum rounds to a double whose shortest printable form is 3.5199999999999996 instead of 3.52. In the first two examples the hidden errors happened to cancel out when the sum was rounded, so the result still prints as the decimal you expect. Ending one of the numbers on a 0 (or writing it with a single digit of precision, where the trailing 0 is implied) made things work out here, but that is luck of the rounding, not a rule you can count on.
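You can poke at those hidden trailing bits directly by asking for more digits than the default printout shows. Another console sketch; the digit strings in the comments are what a standard IEEE-754 double engine produces:

(1.41).toPrecision(20);         // "1.4099999999999999201" -- 1.41 has no exact binary form
(2.11).toPrecision(20);         // "2.1099999999999998757" -- neither does 2.11
(1.41 + 2.11).toPrecision(20);  // "3.5199999999999995737" -- the two errors combine and the sum
                                //   rounds to the double that prints as 3.5199999999999996
(1.42 + 2.10).toPrecision(20);  // "3.5200000000000000178" -- these carry errors too, but they
                                //   happen to land the sum on the double closest to 3.52,
                                //   so it prints clean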