I think the JavaScript logic here is that when you divide a number by a value between 0 and 1, the result gets bigger (dividing by a small fraction is the same as multiplying by its larger reciprocal), and it keeps growing as the denominator shrinks toward 0. So the limit as the denominator approaches 0 is infinity, because the quotient only gets larger the closer the denominator gets to 0.

So 1.00 / 0 = +Infinity according to JavaScript
:woozy_baa:

But I don't understand why that only happens with doubles and not integers. Not that Infinity is any more accurate; I'd expect the result to be NaN/undefined either way.
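A quick console sketch (plain JavaScript, nothing assumed beyond built-in numbers) that walks the same limit argument and shows the actual IEEE 754 results:

```javascript
// Dividing by a number between 0 and 1 is the same as multiplying
// by its (larger) reciprocal, so the quotient grows as the
// denominator shrinks toward 0:
console.log(1 / 0.5);    // 2
console.log(1 / 0.01);   // 100
console.log(1 / 1e-8);   // 100000000
console.log(1 / 0);      // Infinity  (IEEE 754: positive / zero)
console.log(-1 / 0);     // -Infinity (IEEE 754: negative / zero)
console.log(0 / 0);      // NaN       (the only indeterminate case)
```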


@enigmatico That's how it's defined in the IEEE 754 standard. Integer division by zero is a hardware exception on x86, but JS numbers are all floats
stackoverflow.com/questions/33
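A minimal sketch (plain JavaScript, standard engine assumed) of why the integer case never comes up: an "integer" value like 1 is already stored as a 64-bit IEEE 754 double, so the float rule applies to it as well:

```javascript
// There is no separate integer type for plain JS numbers.
const n = 1;
console.log(Number.isInteger(n)); // true
console.log(typeof n);            // "number" (same double type as 1.5)
console.log(n / 0);               // Infinity, not a trap and not NaN
```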
