I think the Java/Script logic here is that when you divide a number by a value between 0 and 1, the result gets bigger (dividing by a fraction is the same as multiplying by its reciprocal), and it keeps growing as the denominator approaches 0. So the limit as the denominator approaches 0 is infinity, since the result just gets larger and larger the closer the denominator gets to 0.
So 1.00/0 = +Infinity according to Java/Script
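Here's a quick Java sketch of that idea (I'm assuming Java rather than JavaScript here, since the double-vs-int difference below only exists in Java; JavaScript's regular numbers are all doubles anyway):

```java
public class DivisionTowardZero {
    public static void main(String[] args) {
        double numerator = 1.00;
        // The closer the denominator gets to 0, the bigger the quotient gets.
        for (double denom = 0.1; denom >= 1e-5; denom /= 10) {
            System.out.println(numerator + " / " + denom + " = " + (numerator / denom));
        }
        // IEEE 754 doubles carry that limit to its conclusion:
        // a positive double divided by zero is +Infinity.
        System.out.println(numerator / 0);  // prints "Infinity"
    }
}
```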
But I don't understand why that only happens with doubles and not with integers. Not that it would be any more accurate, but I'd expect the result to be NaN/undefined.
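For what it's worth, here's what the difference looks like in Java (just a sketch, class name made up). IEEE 754 doubles reserve special bit patterns for +/-Infinity and NaN, while int has no such values, which is presumably why integer division by zero throws instead of returning anything:

```java
public class IntVsDoubleDivision {
    public static void main(String[] args) {
        // Double division follows IEEE 754, which has dedicated
        // encodings for Infinity and NaN.
        System.out.println(1.0 / 0.0);   // Infinity
        System.out.println(-1.0 / 0.0);  // -Infinity
        System.out.println(0.0 / 0.0);   // NaN

        // int has no Infinity or NaN bit pattern (every pattern is an
        // ordinary number), so integer division by zero throws.
        try {
            System.out.println(1 / 0);
        } catch (ArithmeticException e) {
            System.out.println(e);       // java.lang.ArithmeticException: / by zero
        }
    }
}
```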