Because NaN is defined as a number and two equal numbers divided by themselves equal 1

> two equal numbers divided by themselves equal 1

That's not true. For example: 0 == 0, but 0/0 != 1.

(See also +Infinity, -Infinity, and -0.)
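
For anyone who wants to try these, here is a quick check in a JavaScript console (which the snippets further down appear to use; JS numbers are IEEE 754 doubles). -0 is another counterexample to the "equal values divided by themselves give 1" reasoning:

    >> -0 == -0
    true
    >> -0 / -0
    NaN
    >> 1 / -0
    -Infinity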

If you're going to nitpick this comment, you should note that infinity isn't on the number line, that infinity != infinity, and that dividing by zero is undefined.

We're commenting on an article about IEEE 754 floating point values. Following the IEEE 754 standard, we have:

    >> isNaN(Infinity)
    false
    >> Infinity == Infinity
    true
    >> Infinity / Infinity == 1
    false

    >> isNaN(0)
    false
    >> 0 == 0
    true
    >> 0 / 0 == 1
    false

Also, you say NaN ("not a number") is "defined as a number", but Infinity is not. I would think every IEEE 754 value is either "a number" or "not a number". But apparently you believe NaN is both and Infinity is neither?

And you say 0 / 0 is "undefined" but the standard requires it to be NaN, which you say is "defined".
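
Again assuming a JavaScript console (where numbers are IEEE 754 doubles), this is easy to confirm: 0 / 0 evaluates to NaN, and NaN itself is a value of type "number":

    >> 0 / 0
    NaN
    >> isNaN(0 / 0)
    true
    >> typeof NaN
    "number"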

It doesn't really matter whether NaN is technically a number or not. I find a rule where "NaN == NaN is true" potentially reasonable (though I prefer the actual standard, where "NaN == NaN is false"). Regardless of which you choose, NaN/NaN = 1 is entirely unacceptable.
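
To make that last point concrete (JavaScript console again, which follows IEEE 754 here): NaN compares unequal to everything, including itself, and NaN / NaN stays NaN rather than becoming 1:

    >> NaN == NaN
    false
    >> NaN != NaN
    true
    >> NaN / NaN
    NaN

One practical upside of NaN == NaN being false is that x != x is the classic way to test for NaN when no isNaN helper is available.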